May 2009

Poster for Toronto Roller Derby, May 30, 2009 - "Chicks Ahoy! vs Gore-Gore Rollergirls"

Tomorrow at the George Bell Arena in Accordion City’s west end, I’ll be playing (and singing) Canada’s National Anthem at the start of the Toronto Roller Derby match between the Gore-Gore Rollergirls and Chicks Ahoy! I’m working on a rendition that respects the proud heritages of both Canada and roller derby.


If you’d like to hear my rockin’ accordion rendition of O Canada and enjoy some roller derby (the motto is “Real Women. Real Hits. Real Heart.”), it’s all happening at the George Bell Arena (215 Ryding Avenue, in the “Junction” neighbourhood). Tickets are $12 in advance and $15 at the door, and kids under 7 get in free. The doors open at 6:30 p.m. and the game starts at 7:30, which is when I figure I’ll be playing.

For those of you who can’t remember or don’t know the words to our anthem, here they are:

O Canada!
Our home and native land!
True patriot love
In all thy sons command.

With glowing hearts
We see thee rise,
The True North strong and free!
From far and wide,
O Canada,
We stand on guard for thee.

God keep our land
Glorious and free!
O Canada, we stand on guard for thee.
O Canada, we stand on guard for thee.

I’ll post my chord arrangements later.

And, as further enticement, here are the Gore-Gore Rollergirls:

The Gore-Gore Rollergirls


Desktop "tower" computer with coffee machine built in

Yes, Toronto Coffee and Code returns today! It’s a day when I make myself (and by extension, Microsoft) very accessible by working out of a cafe and answering your questions, getting your opinions and sharing ideas. Once again, it’ll take place at the Dark Horse Cafe at 215 Spadina, near the lights between Dundas and Queen.

I’ve got some stuff to do at Microsoft Canada HQ in the morning, so I’m declaring today’s schedule to run from 1:00 p.m. to 6:00 p.m. I might be there earlier, but I thought I’d give myself extra time in case traffic decides not to cooperate. I’ll see you there!


Fast Food Apple Pies and Why Netbooks Suck

by Joey deVilla on May 26, 2009

Yup, another article originally published in my tech blog, Global Nerdy. As with the previous two, this one is of interest not just to programmers, but to anyone using portable and mobile computing devices, such as smartphones, netbooks and laptops.

If you’re pressed for time, the graphic below – which takes its inspiration from these articles by Kathy “Creating Passionate Users” Sierra — captures the spirit of this article rather nicely:

Kathy Sierra-esque graph showing the relative positions of the smartphone (great for when you're on the go), the laptop (great for when you're sitting down) and in between, the netbook (zone of suck)

If you have a little more time to spare, I’m going to explain my belief that while netbooks have a nifty form factor, they’re not where the mobile computing action is.

A Tale of Two Pies

When I was Crazy Go Nuts University’s second most notorious perma-student (back in the late ‘80s/early ‘90s), I took a handful of business courses at the recommendation of my engineering and computer science professors. “You’re going to have to learn to speak the suits’ language,” they said. Crazy Go Nuts University has a renowned business school and I thought it would be a waste not to take at least a couple of business courses. I especially liked the Marketing course, and one lecture stands out in my mind: a case study comparing the dessert offerings of two major fast food chains.

In the interest of not attracting the attention of their lawyers, I’m going to refer to the chains as:

  • Monarch Burger, whose mascot is a mute monarch with a glazed-over face, wearing a crown and associated paraphernalia, and
  • Jester Burger, whose mascot is a clown in facepaint and a brightly-coloured jumpsuit who loves to sing and dance.

Both Monarch Burger and Jester Burger offered a dessert that went by the name “apple pie”. Let’s examine them.

Monarch Burger’s Pie

Monarch Burger's apple pie: a slice of pie served in a wedge-shaped box

Monarch Burger went to the trouble of making their apple pie look like a slice of homemade apple pie. While it seems appealing in its photo on the menu, it sets up a false expectation. It may look like a slice of homemade apple pie, but it certainly doesn’t taste like one. Naturally, it flopped. Fast-food restaurants are set up to be run not by trained chefs, but by a low-wage, low-skill, disinterested staff. As a result, their food preparation procedures are designed to run on little thinking and no passion. They’re not set up to create delicious homemade apple pies.

Jester Burger’s Pie

Jester Burger's apple pie: a tube of pastry, whose skin is pocked from deep-frying

Jester Burger’s approach was quite different. Their dessert is called “apple pie”, but it’s one in the loosest sense. It’s apple pie filling inside a pastry shell shaped like the photon torpedo casings from Star Trek. In the 70s and 80s, the pastry shell had bubbles all over it because it wasn’t baked, but deep-fried. After all, their kitchens already had deep fryers aplenty – why not use them?

Unlike Monarch Burger’s offering, Jester Burger’s sold well because it gave their customers a dessert reminiscent of an apple pie without setting up any expectations for real apple pie.

Jester Burger’s pie had an added bonus: unlike Monarch Burger’s pie, which was best eaten with a fork, Jester Burger’s pie was meant to be held in your hand, just like their burgers and fries.

At this point, I am obliged to remind you that this isn’t an article about 1980s-era desserts at fast food burger chains. It’s about netbooks and smartphones, but keep those pies in mind…

Netbooks are from Monarch Burger…

Netbooks remind me of Monarch Burger’s apple pie. Just as Monarch Burger took the standard apple pie form and tried to fit it into a fast food menu, the netbook approach takes the standard laptop form and tries to fit it into mobile computing. The end result, to my mind, is a device that occupies an uncomfortable middle ground between laptops and smartphones, one that tries to please everyone and pleases no one. Consider the factors:

  • Size: A bit too large to go into your pocket; a bit too small for regular day-to-day work.
  • Power: Slightly more capable than a smartphone; slightly less capable than a laptop.
  • Price: Slightly higher than a higher-end smartphone but lacking a phone’s capability and portability; slightly lower than a lower-end notebook but lacking a notebook’s speed and storage.

To summarize: Slightly bigger and pricier than a phone, but can’t phone. Slightly smaller and cheaper than a laptop, but not that much smaller or cheaper. To adapt a phrase I used in an article I wrote yesterday, netbooks are like laptops, but lamer.

Network Computers and Red Herrings

Sun's "JavaStation" network computer

The uncomfortable middle ground occupied by the netbook reminds me of another much-hyped device that flopped – the network computer, which also went by the name "thin client". In the late 90s, a number of people suggested that desktop computers, whose prices started at the mid-$1000 range in those days, would be replaced by inexpensive diskless workstations. These machines would essentially be the Java-era version of what used to be called "smart terminals", combining local processing power with network-accessed storage of programs and data.

A lot of the ideas behind the network computer ended up in today’s machines, even if the network computer itself didn’t. Part of the problem was the state of networking when the NC was introduced; back then, broadband internet access was generally the exception rather than the rule. Another major factor was price – desktop and even laptop computer prices fell to points even lower than those envisioned for NCs. Finally, there was the environment in which the applications would run. Everyone who was betting on the NC envisioned people running Java apps pushed across the network, but it turned out that the things they had dismissed as toys — the browser and JavaScript, combining to form the juggernaut known as Ajax — ended up being where applications "lived".

When I look at netbooks, I get network computer deja vu. I see a transitory category of technology that will eventually be eclipsed. I think that laptops will eventually do to netbooks what desktop machines did to network computers: evolve to fill their niche. Just as there are small-footprint desktop computers that match the functionality and price point of a network computer while adding the benefits of local storage, I suspect that what we consider to be a netbook today will be just another category of laptop computer tomorrow.

A netbook displaying a picture of a red herring on its screen

I’m going to go a little farther, beyond stating that netbooks are merely the present-day version of the network computer. I’m going to go beyond saying that while their form factor is a little more convenient than that of a laptop, the attention they’re getting – there’s a lot of hoo-hah about who’s winning in the netbook space, Windows or Linux – is out of proportion to their eventual negligible impact. I’m going to go out on a limb and declare them to be a dangerous red herring, a diversion from where the real mobile action is.

…and Smartphones are from Jester Burger

Southern Chicken Place's apple pie, which looks a lot like Jester Burger's apple pie

A quick aside: The photo above is not of a Jester Burger fried apple pie. In response to their customers’ so-called health concerns (really, if those concerns were real, they’d stop eating there), they started phasing out the fried pies in 1992 in favour of the baked kind. There are still some branches of Jester Burger that carry the fried pies, but a more reliable source is a fast food chain that I’ll refer to as “Southern Chicken Place”, or SCP for short. Those pies in the photo above? They’re from SCP.

Jester Burger made no attempt to faithfully replicate a homemade apple pie when they made their dessert. Instead, they engineered something that was “just pie enough” and also matched the environment in which it would be prepared (a fast food kitchen, which didn’t have ovens but had deep fryers) and the environment in which it would be eaten (at a fast food restaurant table or in a car, where there isn’t any cutlery and everything is eaten with your hands). The Jester Burger pie fills a need without pretending to be something it’s not, and I think smartphones do the same thing.

Smartphones are truly portable. They really fit into your pocket or hang nicely off your belt, unlike netbooks:

Two Japanese models trying to stuff a Sony Vaio netbook into their pockets

And smartphones are meant to be used while you’re holding them:

Captain Kirk, his communicator and the iPhone

Just try that with a netbook. In order to really use one, you’ve got to set it down on a flat surface:

Guy using his netbook, perched on the roof of his car...with a stylus, no less!

The best smartphones make no attempt to faithfully replicate the laptop computer experience in a smaller form. Instead, they’re “just computer enough” to be useful, yet better fit the on-the-go situations in which they will be used. They also incorporate mobile phones and MP3 players – useful, popular and familiar devices — and the best smartphones borrow tricks from their user interfaces.

Smartphones, not netbooks, are where the real advances in mobile computing will be made.

Smartphone vs. Netbook: The People Have Chosen

Once again, the thesis of this article, in graphic form:

Same graph as the earlier Kathy Sierra-esque one at the start of the article.

In the late 80s and early 90s, the people chose the fast food apple pie they wanted: the convenient, if not exactly apple pie-ish Jester Burger pie over Monarch Burger’s more-like-the-real-thing version.

When people buy a smartphone, which they’ve been doing like mad, they’re buying their primary mobile phone. It’s the mobile phone and computing platform that they’re using day in and day out and the device that they’re pulling out of their pockets, often to the point of interrupting conversations and crashing the trolley they’re operating.

When people buy a netbook, they’re often not buying their primary machine. It’s a second computer, a backup device that people take when their real machine – which is often a laptop computer that isn’t much larger or more expensive – seems like too much to carry. It’s a luxury that people might ditch if the current economic situation continues or worsens and as the differences between laptops and netbooks vanish. Netbooks, as a blend of the worst of both mobile and laptop worlds, will be a transitional technology; at best, they’ll enjoy a brief heyday similar to that of the fax machine.

The people are going with smartphones, and as developers, you should be following them.


Mental Models, Mantras and My Mission

by Joey deVilla on May 25, 2009

This article also appears in Global Nerdy. Like the previous article, it’s about my role at Microsoft and doesn’t delve too deeply into technology, so I thought it was suitable for a more general audience and decided to republish it here. Enjoy!

Mental Models and Bill Buxton’s “Draw a Computer” Exercise

Bill Buxton

In the mid 1990s, well before he was Microsoft’s user interface guru, Bill Buxton often asked people to carry out a simple little exercise: draw a picture of a computer. Most, if not all, of the people he asked would draw something that fit the common mental model of the desktop computer of the era: cathode ray tube-type monitor, keyboard, mouse and that box housing the motherboard and drives that many people mistakenly refer to as “the CPU”.

If Buxton were to ask the question today, the drawings of computers might look like these:

Four computers from the 2000s - a laptop, a couple of all-in-one-desktops and a desktop with a "box" -- all with flat screens

If he asked the question in the mid-to-late 1980s, the drawings might’ve looked like these:

80s-era computers: Apple ][, Commodore 64, TRS-80 and IBM PC

And had he asked the question in the mid-60s, the drawings might’ve looked like this:

The classic fake "home computer as envisioned by RAND" photo

Buxton likes to point out that the changes in computers from the 60s onwards are largely in the implementation technology, processing power and outward appearance. When most people draw computers, he said, they’re merely drawing their mental model, which is based on the outer packaging.

However, if you use the mental model of a technologist, computers have been essentially the same instruction/ALU/storage/input-output boxes whether they’ve occupied whole rooms or fit in your pocket. They’ve been pretty much the same at their core, in the same way that, fancy tech and hybrid engine aside, there really isn’t much that separates a present-day Toyota Prius from a Model T Ford.

If Bill Buxton could approach Microsoft Corporation as a person — and hey, that’s the way the law treats corporations, so why not? – and ask him/her to draw a computer, I suspect that s/he would draw something based on the mental model of a souped-up circa-2000 computer: a desktop computer with a nice flatscreen monitor, running Windows XP and having a somewhat limited connection to the ‘net.

I think that this is a problem. I also think that the source of this problem is Microsoft’s success.

Microsoft’s Company Mantras

“A PC on every desk and in every home” was Microsoft’s longest-lived slogan and the company mantra for the first 24 years of its existence. Like the best slogans, it succinctly summarized the company’s goal. The problem is that the goal has pretty much been reached. In most parts of the first world, a good chunk of the second world and even a sizeable fraction of the third world, you can easily find a desktop computer, and it’s quite likely that it’s running some sort of Microsoft software.

Since 1999, the company mantra – I really hesitate to use the phrase “vision statement” — has been a little more vague. The company’s been thrashing between mantras a little more frequently, as you can see in this list taken from chapter 1 of How We Test Software at Microsoft:

  • 1975 – 1999: “A PC on every desk and in every home.”
  • 1999 – 2002: “Empowering people through great software – any time, any place and on any device.”
  • 2002 – 2008: “To enable people and businesses throughout the world to realize their full potential.”
  • 2008 – present: “Create experiences that combine the magic of software with the power of internet services across the world of devices.”

The post-1999 mantras all seem a little limp in comparison to the original. Reading them, I cannot help but think of a quote attributed to web design guru Jeffrey Zeldman:

"…provide value added solutions" is not a mission. "Destroy All Monsters." That is a fucking mission statement.

Because the old mantra lasted for so long and the new mantras just don’t have the same straightforwardness and gravitas (How We Test Software at Microsoft quotes Ballmer as saying that we may never again have a clear statement like the original to guide the company), the original remains quite firmly etched in the company culture and mindset.

I think it’s holding us back.

The Desktop as the Goose That Laid the Golden Egg

Altair 8800 computer on display at Microsoft's Building 92 gallery

The original mantra doesn’t just focus on the desktop, it actually mentions it by name. In 1975, when computers were room-filling behemoths that you could access either via batch or time-share, the concept of a desktop computer was downright radical. If you think the iPhone is impressive (and yes, it is), imagine how mind-blowing the Altair 8800, the first commercially-available desktop computer, must have been to a geek back in the Bad Old Days. It was the platform on which Microsoft’s first product – a little programming language called Altair BASIC – was launched, and it was BASIC that in turn launched the company.

In his book Outliers, Malcolm Gladwell talks about how the Altair 8800 was a golden opportunity for Bill Gates and his buddies at his fledgling company, then called “Micro-Soft”. Unlike a lot of other companies at the time, they took the desktop computer seriously. Even when IBM got into the desktop computer game in 1981, it was a product of their Entry-Level Systems division, a clear indication that they thought the PC was a machine you bought until you were ready to graduate to a real computer. I don’t think that this philosophy ended up serving them well.

An Applesoft BASIC cassette featuring a sticker that says "Copyright Microsoft, 1977"

Since the big boys were paying no mind to the desktop computer, upstarts like Microsoft had a big empty field in which to play, and they thrived. Crack open just about any late 70s/early 80s computer that had BASIC built in – even Apple machines — and you’ll see a row of ROM chips with a Microsoft copyright notice. It was Microsoft that swooped in with PC-DOS when a deal with Digital Research for a PC version of CP/M was slow in coming (and this is despite the fact that Gates recommended that IBM go to Digital for an OS). A lot of people’s experience with desktop computers (and Microsoft revenue) is defined by circa-1995 Microsoft thanks to Windows 95 and the results of Bill Gates’ memo titled The Internet Tidal Wave, both of whose influences are still felt to this day.

Once upon a time, it used to be unusual to walk into someone’s home or office and see a computer. These days, it’s unusual to walk into someone’s home or office and not see a computer, and Microsoft’s focus on the desktop had a lot to do with that.

The Desktop as Albatross

Albatross, shot with a sucker-dart arrow, falls on the head of a Disney-esque cartoon character

When electric motors first became available, engineers envisioned factories and eventually houses being equipped with a single electric motor. They imagined that the central motor would, through a series of gears and drive belts, be connected to whatever machines in the house or factory had to be driven by it. What happened in the end is that rather than relying on some central motor, electric motors “disappeared” into the devices that used them. Here’s an exercise to try: go and count the electric motors in your house or apartment right now. The number should be a couple dozen, and if you can’t find them, this article might help.

When big, room-filling computers first became available, engineers envisioned businesses being equipped with a single computer in a manner roughly analogous to the aforementioned big central motor. We know what happened in the end – while many businesses do make use of big datacenters, a lot of the computing power got spread out into desktop computers.

I have a theory that comes in two parts:

  1. Just as electrical motors disappeared into the devices that needed their work, and just as computing power got spread out from big mainframes into desktop machines, computing power is now both disappearing and spreading out into mobile devices and the web/cloud.
  2. Microsoft, with its desktop-centric approach, at least outwardly appears to be missing out on this migration of computing power.

Most of the company’s attention, at least to an outside observer, seems to be focused on Windows 7. Yes, with computer sales being what they are, Windows 7 will probably end up on more laptops and netbooks than desktops, but I consider those devices to simply be the desktop computer in a more portable form. It worries me that there have been more concrete announcements about Windows 7 on netbooks than about upcoming versions of Windows Mobile, despite the iPhone and BlackBerry-driven evidence that the real mobile action is in smartphones.

(Tomorrow, I’ll post an article in which I argue that netbooks are a dangerous red herring pulling away our attention from devices like smartphones.)

Microsoft ASP.NET

Even when the company reaches out beyond desktop development, there’s no escaping the desktop “gravity well”. Consider ASP.NET (that is, the “traditional” ASP.NET, not the recently-released ASP.NET MVC). To my mind, as well as the minds of a lot of other web developers, it’s a web framework that tries really hard to pretend that the web doesn’t exist. It makes use of a whole lot of tomfoolery like ViewState to create a veneer of desktop app-like statefulness over the inherently stateless web, along with a programming model that tries to mimic the way you’d write a desktop application. It’s almost as if it were designed with the mantra “the web is like the desktop, but lamer” instead of “the web is like the desktop, but everywhere”. Although the framework works just fine and there are a number of great sites and web apps built on it, I think a lot of developers sensed this design philosophy and went elsewhere for web development.
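For readers who haven’t run into ViewState: Web Forms keeps up that veneer of statefulness by serializing page state into a hidden __VIEWSTATE form field that rides along with every postback. Here’s a toy sketch of the mechanism in Python – not actual ASP.NET. The __VIEWSTATE field name is real, but the helper functions and the JSON-over-base64 encoding are illustrative stand-ins for the framework’s own serialization:

```python
import base64
import json

def render_page(state: dict) -> str:
    """Serialize page state into a hidden field, roughly the way Web Forms
    stashes control state in __VIEWSTATE on every response."""
    blob = base64.b64encode(json.dumps(state).encode()).decode()
    return (
        '<form method="post">\n'
        f'  <input type="hidden" name="__VIEWSTATE" value="{blob}" />\n'
        '  <input type="submit" name="btnSave" value="Save" />\n'
        '</form>'
    )

def handle_postback(form_fields: dict) -> dict:
    """Recover the 'desktop-like' state from the round-tripped hidden field;
    HTTP itself remembers nothing between requests."""
    blob = form_fields["__VIEWSTATE"]
    return json.loads(base64.b64decode(blob))

# One request/response cycle: the state survives only because the page
# carries it back and forth inside itself.
page = render_page({"counter": 3, "selected_tab": "orders"})
posted = {"__VIEWSTATE": page.split('value="')[1].split('"')[0]}
recovered = handle_postback(posted)
```

The point is that the protocol underneath is stateless; every scrap of “desktop-like” state has to be smuggled through each round trip, which is exactly the pretending-the-web-isn’t-there flavour I’m describing.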

(An aside: My old boss at OpenCola in late 2001 told me that he’d been meeting with Microsoft people and suspected that Internet Explorer 6 would be the final version of their browser. The expectation was that web pages and web applications would be replaced by Windows client applications pushed over the net – a prediction similar to one made by the Java folks a few years prior.)

Windows Mobile logo

The same situation exists with Windows Mobile’s current user interface, which is basically a subset of Windows’ standard UI controls for the desktop, scaled down to fit smaller screens, and with a stylus standing in for the mouse. It’s almost as if it were designed with the mantra “mobile computing is like desktop computing, but lamer” instead of “mobile computing is like a mobile phone plus PDA and an MP3 player, but cooler.” If the ASP.NET design mantra is a whisper, the Windows Mobile mantra is a scream.

I suspect that the reason the Xbox 360 didn’t fall into a similar kind of trap — “set-top boxes are like desktop computers, but lamer and only for games” – is that the Xbox team is situated off the Microsoft Campus and less susceptible to the desktop influence.

My Mission

Stick figure, chained to desk, breaking the chain

At my most recent one-on-one meeting with my manager John Oxley, we talked about a need for each member of our Evangelism team to define his or her area of focus. The Microsoft platform is a vast, nerdy expanse spanning the range from embedded computing all the way to Cray supercomputers; no single person can hope to cover it all.

He already had a good idea of what I wanted to focus on, and by now, I guess you do as well. I feel that just as computing expanded beyond the big computer rooms and onto our desktops, computing is expanding beyond our desktops into all sorts of different places:

  • Invisibly, into the web and cloud in the form of web applications and services
  • Visibly, into our pockets and living rooms, and embedded into all sorts of real-world things

While I believe that Windows 7 is a necessary part of the Microsoft platform, I’m not too worried about focusing on it – there are more than enough people at the company to promote and evangelize it. I want to focus on the platforms that I feel Microsoft hasn’t given enough love and attention: the non-desktop platforms of the web, mobile and gaming, as well as where they intersect.

It’s a big area to cover, but I think Microsoft needs to be active in it if the company wants to be true to its forward-looking roots. I even have a mantra for it: “To help web, mobile and game developers using Microsoft tools go from zero to awesome in 60 minutes.” I want to give developers both that rush of getting started with a new technology and the sustained passion to keep working with it, in the same way that Ruby on Rails and the iPhone hooked developers with an initial flash of excitement and turned it into long-term passion. It’s an ambitious, audacious mission, but no more so than the one coined by a bunch of scruffy nerds in New Mexico in the 1970s: “A PC on every desk and in every home.”

Joey deVilla with cardboard cutouts of Microsoft's 1978 team


Evangelist, Immigrant and Shaman

by Joey deVilla on May 24, 2009

This article also appears in Global Nerdy. I thought it might be of interest not just to geeks, but also to people who are thinking about defining their roles at work.

This week, Microsoft Canada’s Developer and Platform Evangelism team is getting together to do its planning for the upcoming financial year, which runs from July to June in The Empire. There’s a lot to talk about, especially in a year that combines the Credit Crunch, the releases of new versions of Windows, Windows Mobile, Visual Studio and who-knows-what-else, and a company looking to establish its place in an increasingly web- and mobile-driven world.

A good place to start might be to think about the roles that we, as individual members of the Evangelism team, play.

Evangelist

Old colorized photo of a boy evangelist with the title "I've got a message!"

Unlike Anil Dash and Jeff Atwood, I never had any reservations about the job title “Evangelist”. The religious connotations never bothered me. It might have had something to do with spending eight years in a Catholic school — it didn’t do me any harm, and it didn’t seem to hurt Keanu, who went to the same school around the same time. It might also have something to do with the fact that like Atwood, I think that “Software development is a religion, and any programmer worth his or her salt is the scarred veteran of a thousand religious wars.” I could never be happy with only programming; I need to mix it with sharing the knowledge and passion for the craft through writing, speaking, schmoozing, performing and entertaining.

Like evangelism of the religious kind, being a technical evangelist isn’t a job that you can do “on autopilot”. There are some jobs in which you can function, and even excel, even though you hate them and the work is of no interest to you. No doubt you’ve seen or know people who do their jobs “on autopilot”, functioning well enough to perform the tasks required of them. Evangelism isn’t one of those jobs. As the title implies, if you don’t believe in what you’re talking about – if you don’t have faith – you can’t get the job done. Evangelism is about winning hearts and minds; people just know when you’re faking it, and once they know, they’ll never listen to you again.

Guy Kawasaki

I’ve wanted to be a technical evangelist ever since I learned about Guy Kawasaki, who held the title at Apple in the mid 1980s. He may not have invented the title or the position – credit for that has to go to Mike Boich, Guy’s buddy at Apple – but he popularized the term and set the standard. The job engages both what we colloquially refer to as the “left brain” and the “right brain”; it requires you to tap into your rational and creative sides, often simultaneously. It’s the sort of work that I can really sink my teeth into. It is my dream job.

Nobody questions my suitability as an evangelist. People have asked about my suitability as an evangelist for Microsoft. How can a guy who’s been working largely in the open source world for the past seven or so years, mostly on a Mac, be an evangelist for The Empire?

Immigrant

Immigrant family on Ellis Island looking at the Statue of Liberty in the distance

I came to appreciate Microsoft’s tools after leaving my first job. In 1997, my friend Adam P.W. Smith and I left multimedia development at a shop called Mackerel to try our hand at building “real” applications at our own little consultancy. We wanted to graduate from building multimedia apps for marketing and entertainment purposes – software you might run once or twice and then discard — and start building applications that people would use in their everyday work to get things done.

Despite being Mac guys at heart, we chose the Windows platform since that’s what our customers were using, and opted to use Visual Basic to build our apps. Although it was considered “the Rodney Dangerfield of programming tools”, Visual Basic in the pre-.NET era was the best tool for producing great applications in a timely fashion that both we and our customers (who got the source code) could easily maintain. Our longest-lived application, a database of every mall in America written for National Research Bureau in Chicago, was first written in 1998 and its codebase lived on until a couple of years ago. In today’s world of ephemeral Web 2.0 apps, that’s an Old Testament lifetime.

Splash screens for "HPS Training System" and "Shopping Center Directory on CD-ROM"

Just as the best immigrants bring a little bit of their home culture and add it to the mix in their newly-adopted country, we decided to bring Macintosh user interface and workflow culture to the Windows world. We took care to write user-friendly error messages and also structured our applications so that you wouldn’t see them often. Our layout was consistent and everything was clearly labelled so you never felt lost in the application. And yes, we sweated over aesthetics because we felt that beautiful tools lead to better work.

Here’s the original application that we were given as a guide:

Original crappy SCD screen 

…and here’s our rewritten-and-redesigned-from-the-ground-up app that we built for National Research Bureau:

New and improved SCD main screen

(For more on what we did, visit the page where we showcase our work.)

A decade later, I find myself an immigrant in the world of Windows development, and once again, I want to bring a bit of the cultures from which I came and add it to the mix. This time, that culture is from Build-on-Mac-Deploy-on-Linux-istan, a cultural crossroads that blends a strong design aesthetic with a focus on the web, mobile applications, unit testing, distributed version control, sharing code and a scrappy startup work ethic and spirit. At the same time, I see the potential in my new Microsoft homeland, with its expansive reach into just about every level of computing, from embedded systems to giant enterprise datacentres, its excellent IDEs and frameworks and its large developer base. As an “immigrant” Microsoft evangelist, I see the chance to ply my trade in a new land that needs my skills, energy and outside perspective, and to earn a fair reward for my efforts.

Shaman

Shaman holding a Windows 7 logo

I’ve been trying to take how I see my role at Microsoft and distill it into a single idea, perhaps even a single word. The term “Change Agent”, which appeared all over the place in early issues of Fast Company, captures a lot of what I’m trying to express, but it feels sort of clumsy and doesn’t have that summarize-a-big-concept-in-a-single-word oomph that “Evangelist” has.

Luckily for me, my friend Andrew Burke was reading an editorial in Penny Arcade which had the perfect word:

What Microsoft needs badly is a shaman. They need somebody who is situated physically within their culture, but outside it spiritually. This isn’t a person who hates Microsoft, but it’s a person who can actually see it. I can do this for you. Give me a hut in your parking lot. I will eat mushrooms, roll around in your cafeteria, and tell you the Goddamned truth.

That’s not bad. There are a number of ways in which “shaman” might be more applicable than “evangelist”.

Family photo where everyone except one kid is dressed in their Sunday best; one kid is dressed like a biker/metal dude.

For starters, I am situated physically within Microsoft’s culture, but in many ways I’m outside it spiritually. This is thanks to the fact that I’m a mobile worker and don’t have a cubicle within Microsoft’s offices and to my manager John Oxley’s efforts to keep me from getting too deeply entrenched within the culture. I was hired partly for my outsider’s perspective, and for me to be effective, I need to maintain some of my “outsideness”. This perspective makes me able to do or see things that a hardcore Microsoftie might not consider (such as Coffee and Code) or perceive (such as the rise of the iPhone, while Steve Ballmer said that “There’s no chance that the iPhone is going to get any significant market share”).

"Mediator" photo: guy in suit acting as a referee for two guys in suits arm-wrestling

Unlike religious evangelists, shamans are mediators. While an evangelist’s communication is typically one-way, from the supernatural to the people, the shaman not only speaks on behalf of the supernatural to the people to influence them, but also on behalf of the people to the supernatural to influence it back. If I am only evangelizing to developers on behalf of Microsoft, I’m only doing half my job. I also need to evangelize to Microsoft on behalf of the developer community.

When I joined Microsoft, a number of my friends suggested that I’d be good at changing the company from the inside. I think that task is better left to the people who develop its technologies or strategy; as an Evangelist – er, Shaman – I am better positioned to change the company from the outside. Think about it: a good chunk of what makes a platform is its developer community; without it, the platform just sits there. Without their developer communities, Windows wouldn’t have become the dominant desktop system, Linux wouldn’t have become the dominant web OS and the iPhone would be another Nokia N-Gage. Developers shape the platform just as much as the platform vendor, and they do it best when they have a conduit to their platform vendor – a shaman.

Package for the Nintendo game "Captain Planet and the Planeteers"

For some religions, the position of shaman is also an ecological one, and as a developer evangelist, so is mine. According to Wikipedia, some shamans “have a leading role in this ecological management, actively restricting hunting and fishing”. I am charged with making sure that Canada’s developer ecology is a healthy one; in fact, when I was hired, I was told that I was hired “for Canada first, and Microsoft second.”

A healthy, thriving developer ecosystem is good for the field, which in turn is good for Microsoft. As a developer who likes to participate in the community, I have an active interest in keeping the ecosystem healthy, and a Microsoft that contributes positively to that ecosystem is a good thing. The nurturing of ecosystems isn’t covered by evangelism, but it certainly falls under a shaman’s list of tasks.

Wide-eyed LOLcat hiding: "Bad trip kitteh wishes furniture would just stay in one place."

And finally, the idea of eating mushrooms and rolling around the Microsoft cafeteria is intriguing. I doubt that they’d tolerate me playing my accordion while high as a kite, wearing nothing but body paint and assless chaps, rolling all over the salad bar and smothering myself with cottage cheese. It is an amusing idea, though.

{ 1 comment }

Ben_Bernanke

First, a quick “congratulations!” to my brother-in-law on graduating from Boston College’s School of Law! Law School ain’t easy, and graduating cum laude is even tougher. Well done – I salute you with a filet mignon on a flaming sword!

The graduation ceremony took place on the Boston College campus. BC has considerable “juice”, and as such, they were able to land a pretty high-up commencement speaker: Ben Bernanke, Chairman of the Board of Governors of the United States Federal Reserve (a.k.a. “The Fed”) and “Fourth Most Powerful Person in the World” according to Newsweek. It’s the first commencement ceremony I’ve ever been to that had the Secret Service (or whatever federal cops get assigned to high-ranking non-Executive Branch people) present.

There were no great revelations in Bernanke’s speech; I was hoping that he’d reveal some plan for saving the economy by decoding the hints hidden in the U.S. Constitution that would lead us to a hidden stash of gold, à la the National Treasure movies, but no such thing happened. If he has new ideas for bringing about economic recovery, he didn’t give them away – in fact, he told the business reporters that they might as well go get some coffee, as he wasn’t going to cover them.

bernanke_looks_on Instead, his speech had the usual platitudes and advice – you’re at the start of a great journey, be flexible, embrace change, don’t be afraid to go outside your comfort zone – modified to suit the times, including the obligatory reference to the current state of the economy, stating that it has “dominated my waking hours” for the past twenty-one months. He also talked about how he went from a South Carolina boy with no expectations to move far away from home to his current position, providing some biographical information which I didn’t know before. It wasn’t a bad speech – I’ve heard longer and less interesting ones at other commencements – but aside from a J.K. Rowling joke, it wasn’t anything out of the ordinary.

Here’s the complete transcript of his speech, which you can also find at The Fed’s Board of Governors site:


I am very pleased to have the opportunity to address the graduates of the Boston College Law School today.  I realized with some chagrin that this is the third year in a row that I have given a commencement address here in the First Federal Reserve District, which is headquartered at the Federal Reserve Bank of Boston.  This part of the country certainly has a remarkable number of fine universities.  I will have to make it up to the other 11 Districts somehow.

Along those lines, last spring I was nearby in Cambridge, speaking at Harvard University’s Class Day.  The speaker at the main event, the Harvard graduation the next day, was J. K. Rowling, author of the Harry Potter books.  Before my remarks, the student who introduced me took note of the fact that the senior class had chosen as their speakers Ben Bernanke and J. K. Rowling, or, as he put it, "two of the great masters of children’s fantasy fiction."  I will say that I am perfectly happy to be associated, even in such a tenuous way, with Ms. Rowling, who has done more for children’s literacy than any government program I know of.

I get a number of invitations to speak at commencements, which I find a bit puzzling.  A practitioner, like me, of the dismal science of economics–and it is even more dismal than usual these days–is not usually the first choice for providing inspiration and uplift.  I will do my best, though, and in that spirit I will take a more personal perspective than usual in my remarks today.  The business reporters should go get coffee or something, because I am not going to say anything about the markets or monetary policy.

Instead, I’d like to offer a few thoughts today about the inherent unpredictability of our individual lives and how one might go about dealing with that reality.  As an economist and policymaker, I have plenty of experience in trying to foretell the future, because policy decisions inevitably involve projections of how alternative policy choices will influence the future course of the economy.  The Federal Reserve, therefore, devotes substantial resources to economic forecasting.  Likewise, individual investors and businesses have strong financial incentives to try to anticipate how the economy will evolve.  With so much at stake, you will not be surprised to know that, over the years, many very smart people have applied the most sophisticated statistical and modeling tools available to try to better divine the economic future.  But the results, unfortunately, have more often than not been underwhelming.  Like weather forecasters, economic forecasters must deal with a system that is extraordinarily complex, that is subject to random shocks, and about which our data and understanding will always be imperfect.  In some ways, predicting the economy is even more difficult than forecasting the weather, because an economy is not made up of molecules whose behavior is subject to the laws of physics, but rather of human beings who are themselves thinking about the future and whose behavior may be influenced by the forecasts that they or others make.  To be sure, historical relationships and regularities can help economists, as well as weather forecasters, gain some insight into the future, but these must be used with considerable caution and healthy skepticism.

In planning our own individual lives, we all have a strong psychological need to believe that we can control, or at least anticipate, much of what will happen to us.  But the social and physical environments in which we live, and indeed, we ourselves, are complex systems, if you will, subject to diverse and unforeseen influences.  Scientists and mathematicians have discussed the so-called butterfly effect, which holds that, in a sufficiently complex system, a small cause–the flapping of a butterfly’s wings in Brazil–might conceivably have a disproportionately large effect–a typhoon in the Pacific.  All this is to put a scientific gloss on what you probably know from everyday life or from reading good literature:  Life is much less predictable than we would wish.  As John Lennon once said, "Life is what happens to you while you are busy making other plans."

Our lack of control over what happens to us might be grounds for an attitude of resignation or fatalism, but I would urge you to take a very different lesson.  You may have limited control over the challenges and opportunities you will face, or the good fortune and trials that you will experience.  You have considerably more control, however, over how well prepared and open you are, personally and professionally, to make the most of the opportunities that life provides you.  Any time that you challenge yourself to undertake something worthwhile but difficult, a little out of your comfort zone–or any time that you put yourself in a position that challenges your preconceived sense of your own limits–you increase your capacity to make the most of the unexpected opportunities with which you will inevitably be presented.  Or, to borrow another aphorism, this one from Louis Pasteur:  "Chance favors the prepared mind."

When I look back at my own life, at least from one perspective, I see a sequence of accidents and unforeseeable events.  I grew up in a small town in South Carolina and went to the public schools there.  My father and my uncle were the town pharmacists, and my mother, who had been a teacher, worked part-time in the store.  I was a good student in high school and expected to go to college, but I didn’t see myself going very far from home, and I had little notion of what I wanted to do in the future.

Chance intervened, however, as it so often does.  I had a slightly older friend named Ken Manning, whom I knew because his family shopped regularly at our drugstore.  Ken’s story is quite interesting, and a bit improbable, in itself.  An African American, raised in a small Southern town during the days of racial segregation, Ken nevertheless found his way to Harvard for both a B.A. and a Ph.D., and he is now a professor at MIT, not too far from here.  Needless to say, he is an exceptional individual, in his character and determination as well as his remarkable intellectual gifts.

Anyway, for reasons that have never been entirely clear to me, Ken made it his personal mission to get me to come to Harvard also.  I had never even considered such a possibility–where was Harvard, exactly?  Up North, I thought–but Ken’s example and arguments were persuasive, and I was (finally) persuaded.  Fortunately, I got in.  It probably helped that Harvard was not at the time getting lots of applications from South Carolina.

We all have moments we will never forget.  One of mine occurred when I entered Harvard Yard for the first time, a 17-year-old freshman.  It was late on Saturday night, I had had a grueling trip, and as I entered the Yard, I put down my two suitcases with a thump.  I looked around at the historic old brick buildings, covered with ivy.  Parties were going on, students were calling to each other across the Yard, stereos were blasting out of dorm windows.  I took in the scene, so foreign to my experience, and I said to myself, "What have I done?"

At some level, I really had no idea what I had done, or what the consequences would be.  All I knew was that I had chosen to abandon the known and comfortable for the unknown and challenging.  But for me, at least, the expansion of horizons was exactly what I needed at that time in my life.  I suspect that, for many of you, matriculation at the Boston College law school represented something similar–a leap into the unknown and new, with consequences and opportunities that you could hardly have guessed in advance.  But, in some important ways, leaving the known and comfortable was exactly the point of the exercise.  Each of you is a different person than you were three years ago, not only more knowledgeable in the law, but also possessing a greater understanding of who you are–your weaknesses and strengths, your goals and aspirations.  You will be learning more about the fundamental question of who you really are for the rest of your life.

After I arrived at college, unpredictable factors continued to shape my future.  In college I chose to major in economics as a compromise between math and English, and because a senior economics professor liked a paper I wrote and offered me a summer job.  In graduate school at MIT, I became interested in monetary and financial history when a professor gave me several books to read on the subject.  I found historical accounts of financial crises particularly fascinating.  I determined that I would learn more about the causes of financial crises, their effects on economic performance, and methods of addressing them.  Little did I realize then how relevant that subject would become one day.  Later I met my wife Anna, to whom I have been married now for 31 years, on a blind date.

After finishing graduate school, I began a career as an economics professor and researcher.  I pursued my interests from graduate school by delving deeply into the causes of the Great Depression of the 1930s, along with many other topics in macroeconomics, monetary policy, and finance.  During my time as a professor, I tried to resist the powerful forces pushing scholars to greater and greater specialization and instead did my best to keep as broad a perspective as possible.  I read outside my field.  I did empirical research, studied history, wrote theoretical papers, and established connections, usually in a research or advisory role, with the Fed and other central banks. 

In the spring of 2002, I was asked by the Administration whether I might be interested in being appointed to the Federal Reserve’s Board of Governors.  I was not at all sure that I wanted to take the time from teaching and research.  But this was soon after 9/11, and I felt keenly that I owed my country my service.  Moreover, I told myself, the experience would be useful for my research when I returned to my post at Princeton.  I decided to take a two-year leave to go to Washington.  Well, once again, so much for foresight.  I have now been in Washington nearly seven years, serving first as a Fed governor, then chairman of the President’s Council of Economic Advisers.  In the fall of 2005, President Bush appointed me to be Chairman of the Fed, effective with the retirement of Alan Greenspan at the end of January 2006.

You will not be surprised to hear that events since January 2006 have not been precisely as I anticipated, either.  My colleague, Bank of England Governor Mervyn King, has said that the object of central banks should be to make monetary policy as boring as possible.  Unfortunately, by that metric we have not been successful.  The financial crisis that began in August 2007 is the most severe since the Great Depression, and it has been the principal cause of the global recession that began last fall.  Battling that crisis and trying to mitigate its effect on the U.S. and global economies has dominated my waking hours now for some 21 months.  My colleagues at the Fed and I have been called on to take many tough decisions, including adopting extraordinary and unprecedented policy measures to address the crisis.

I think you will agree that the chain of events that began with my decision to go far from home for college and has culminated–so far–with the role I am playing today in U.S. economic policymaking is so unlikely that we could have safely ruled it out of consideration.  Nevertheless, of course, it happened.  Although I never could have prepared in advance for the specific events of the past 21 months, I believe that my efforts throughout my life to expand my horizons and to keep a broad perspective–for example, to study and write about economic and financial history, as well as more conventional topics in macroeconomics and monetary economics–have helped me better meet the challenges that have come my way.  At the same time, because I appreciate the role of chance and contingency in human events, I try to be appropriately realistic about my own capabilities.  I know there is much that I don’t know.  I consequently try to be attentive to all points of view, to work collaboratively, and to involve as many smart people in policy decisions as possible.  Fortunately, my colleagues and the staff at the Federal Reserve are outstanding.  And indeed, many of them have demonstrated their own breadth and flexibility, moving well beyond their previous training and experience to tackle a wide range of novel and daunting issues, usually with great success.

Law is like economics in that, although it has its own esoterica known only to initiates, it is at bottom a craft whose value lies primarily in its practical application.  You cannot know today what problems or challenges you will face in the course of your professional lives.  Thus, I hope that, even as you continue to acquire expertise in specific and sometimes narrow aspects of the law, you will continue to maintain a broad perspective and willingness, indeed an eagerness, to expand the range of your knowledge and experience.

I have spoken a bit about the economic and financial challenges that we face.  How do these challenges bear on the prospects of the graduates of 2009?  The economic situation is a trying one, as you know.  We are in a recession, and the labor market is weak.  Many of you may not have gotten the job you wanted; some may have had offers rescinded or the start of employment delayed.  I do not minimize those constraints and disappointments in any way.  Restoring economic prosperity and maximizing economic opportunity are the central focus of our efforts at the Fed.

Nevertheless, you are in some ways very lucky.  You have been trained in a field, law, that is exceptionally broad in its compass.  At the Federal Reserve, lawyers are involved in every aspect of our policies and operations–not just because they know the legal niceties, but because they possess analytical tools that bear on almost any problem.  In law school you have honed your skills in reasoning, reading, and writing.  Many of you have work experience or bring backgrounds to bear ranging from history to political science to the humanities to science.  There will always be a need for people with your abilities and talents.

So, my advice to you is to stay optimistic.  Things usually have a way of working out.  My second piece of advice is to be flexible, even adventurous as you begin your careers.  As I have tried to illustrate today, you are much less able than you think to foresee how your life, both professional and personal, will play out.  The world changes too fast, and too many accidents and unpredictable events occur.  It will pay, therefore, to be creative and open-minded as you search for and consider professional opportunities.  Look most carefully at those options that will give you a chance to learn new things, explore new areas, and grow as a person.  Think of every job as a potential investment in yourself.  Will it prepare your mind for the opportunities that chance will provide?

You are lucky also to be living and studying in the United States.  There is a lot of pessimistic talk now about the future of America’s economy and its role in the world.  Such talk accompanies every period of economic weakness.  The United States endured a decade-long Great Depression and returned to prosperity and global leadership.  When I graduated from college in 1975, and from graduate school in 1979, the economy was sputtering, gas prices and inflation were high, and  pessimism–malaise, President Carter called it–was rampant.  The U.S. economy subsequently entered more than two decades of growth and prosperity.  The economy will recover–it has too many fundamental strengths to be kept down for too long–and the mood will brighten.

This is not to ignore real challenges.  Our society is aging, implying higher health-care costs and fiscal burdens.  We need to save more as a country, to reduce global imbalances in saving and investment, and to set the stage for continued growth.  Our educational system is strong in some areas, including our university system, but does not serve everyone equally well, contributing to slower growth and greater income disparities.  In the diverse capacities for which your training has prepared you, many of you will play a vital role in addressing these problems, both in the public and private spheres.

I conclude with congratulations to the graduates, your families, and friends.  You have worked hard and accomplished much.  You have a great deal to look forward to, as many interesting and gratifying opportunities await you.  I hope that as you enter or re-enter the working world, you make sure to stay flexible and open-minded and to learn whenever you can.  That’s the best way to deal with the unpredictabilities that are inherent in life.  I wish you the best of luck, with the proviso that luck is what you make of it.

And perhaps you will advise next year’s class to invite J. K. Rowling.

{ 1 comment }

Then and Now

by Joey deVilla on May 22, 2009

1999 logos: Louis Vuitton, Four Season, Cipriani, Carnegie Hall, Jaguar / 2009 logos:  H&M, Holiday Inn, McDonald's, Netflix, Greyhound

{ 3 comments }