

In innovation, many companies still seem to stick with the first idea they have for a new technology. This keeps happening despite ample research and anecdotes showing that one's first instinct for a technology's application is often the wrong one. For example, "Chemcor," developed in the late 1950s for "phone booths, prison windows, and eyeglasses," was shelved for decades until it became ubiquitous in iPhones.

Some of this resistance to change can be attributed to how projects get approved. Often the research scientist who came up with the idea must identify an application for said idea in order to get funded. (Why the scientist, rather than, say, a marketer, must find the application is a topic for another blog post.) Informing one's superiors of a change in direction could be seen as a sign that the project is failing. (Credit to Session 314 on Entrepreneurial Opportunities for inspiring much of this post.)

But there still seems to be a lack of tolerance in many (but not all) businesses for experimentation. Trying multiple possibilities when just one would do is seen as wasteful. And indeed, sometimes we see the same issues in our classes. Students seem distrustful of classroom experimentation, seeing it as busywork or as a sign that the instructor is not fully prepared.

Is it fair to say that experimentation is a bad word in business circles? And if so, what can be done to convince business people of all ages that experimentation is beneficial?

Post originally appeared on the Academy of Management blog site.

I woke up this morning to read that Reddit had apologized for some of the blunders made by users searching for the identity of the Boston Marathon bombers. Given the FBI's scolding of further amateur detective work once it had identified its suspects, this proves that crowdsourcing was a mistake, that crowdsourcing has failed, and that the public should never help the police again...which is exactly the incorrect conclusion to draw from this case.

I pause to point out that yes, once the FBI knew the proper identities, it was certainly right to tell the crowd to stop searching and to take down any photos that did not include the correct suspects. And the crowd's urge to dig up personal information (as in poor Sunil's case), instead of limiting itself to spotting suspicious patterns in photos, was a terrible mistake. But it's important to note that several in the crowd pointed out such errors and did their best to restrain these instincts. Cases such as Richard Jewell's were duly noted and cited. Rather than blaming crowdsourcing as a whole, let's look more closely at when crowdsourcing is most effective, and whether crowdsourcing was given a fair trial.

First, effectiveness. Crowdsourcing was indeed a great idea early in this case, and I stress early. In my opinion, crowdsourcing is best when (1) members of the crowd may have novel sources of information not available to the original team, (2) the original team can fully share information with the crowd, and (3) the amount of information to sort through is massive. The Boston Marathon case had (1) and (3), but not (2). In the first day or two after the bombing, crowdsourcing could uncover photos taken by private citizens that the FBI did not have access to, or find patterns that had not yet been spotted. Once the FBI had leads, leads it understandably could not share with the public at first, the failure of condition (2) meant that crowdsourcing could no longer be fully effective. But until that point, crowdsourcing still made a great deal of sense because of (1) and (3).

Second, innovation experiments cannot thrive in a high-pressure environment that rewards only a correct outcome and penalizes any error. The Harvard Business Review articles on failure are highly recommended as background reading. Paradoxically, the media, while busy making their own wild guesses and speculations, seemed almost offended that a crowd would venture to do the same thing. Aldrich and Fiol, in their well-regarded 1994 paper (see also Rao's 1994 paper on the automobile industry), describe sociopolitical legitimacy as endorsement by legal authorities, governmental bodies, and other powerful organizations. In this case, the media were influential in de-legitimizing crowdsourcing in the eyes of ordinary citizens. Internet crowdsourcing efforts were immediately criticized by the Atlantic, Slate, and other influential media bodies. Unscrupulous media organizations such as the New York Post ripped off ideas the crowd came up with and trumpeted their scoops, only to cheerfully blame "the Internets" when those ideas proved incorrect. Crowdsourcing can't be effective in a fishbowl, as it takes time to discard bad ideas and elevate good ones.

There are good reasons to beware of crowdsourcing in criminal cases. However, based on my paragraph about effectiveness, I do believe the Boston Marathon bombing was a rare case in which private citizens may initially have had access to visual data unavailable to police, and in which the sheer amount of information made more analyst eyeballs valuable. Crowdsourcing by Reddit and other sites faltered when some crowd members attempted to be both judge and jury (or, if you prefer, detective and arresting officer). That should not mean that crowdsourcing itself was a failure, and it does our national and personal security a grave injury if citizens are now more reluctant to help police efforts.

When can a researcher detect the end of a trend? The concept of "Web 2.0," focusing on the web as an interactive space between users rather than a one-way channel for information, continues to dominate much Internet thought. Companies have rushed to give users many different options to interact with or co-create for the company. But a variety of other trends suggest that less interactivity is on its way. A few examples:

1. Option overload. As interactivity has become more important, the tools and options have improved and multiplied. Forget the comment section of yesteryear: now there are polls, forms, and platforms. But the satire in The Onion points out that many customers feel overwhelmed. Interactivity has gone from special sauce to main course, and that doesn't match the priorities of many users. From an operations perspective, expanding the set of options should only improve the best available solution; in practice, users become overwhelmed and can't take full advantage of the extra options (a toy simulation after this list illustrates the point).

2. Rise of mobile and tablet computing. Interaction requires the user to be able to quickly and easily navigate a text input device, a prerequisite that is at times under-appreciated. (I would love to see statistics on how many Internet users interact less simply because of inadequate typing skills.) Mobile phones and tablets account for an increasing share of Internet traffic, yet it can be difficult to type a lengthy, meaningful comment on such devices, or even to take a survey. The frustrated user will thus decline interaction opportunities.

3. Rewards for Interactivity versus Content Creation. Suppose that you excel at creating animated GIFs. You could use that skill to make art in your favorite blog's comment section, earning some approval from other readers or the blog writer. Or you could start your own blog. Given the improvement of services such as Google AdSense and YouTube Partnership for marketing and monetizing your work, the jump from participating in a conversation to hosting it is a small one. Also, users are increasingly aware that companies profit from their interactivity, and thus will demand more for the (dubious) privilege.

4. Misfit between Interaction Technology and Interaction Utility. Reader, when was the last time you were truly impressed by the design of an interaction opportunity? Good interaction design shows the user the value of interacting before they interact, not after or never. Early adopters will try a new technology just for variety's sake, but to hold the attention of mainstream (early majority and late majority) customers, interaction technology has to bridge the gap between novelty and necessity.
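To make the operations claim in item 1 concrete, here is a minimal simulation sketch (all numbers invented for illustration): the best option on a menu can only improve as the menu grows, but a user who samples just a handful of options captures less and less of that improvement.

```python
import random

random.seed(42)  # reproducible toy run

# The optimization view: a bigger menu can only raise the best available option.
# The human view: an overwhelmed user samples a handful and picks the best of
# those, so the gap between what they get and what was available tends to grow.
for menu_size in [5, 50, 500]:
    menu = [random.random() for _ in range(menu_size)]  # option "quality" scores
    best_available = max(menu)
    user_pick = max(random.sample(menu, k=min(5, menu_size)))  # user looks at 5
    print(f"{menu_size:>3} options: best available {best_available:.2f}, "
          f"overwhelmed user found {user_pick:.2f}")
```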

In the end, I have more questions than answers for you. If I and others are correct, what will the less interactive Internet look like? Will it be dominated by design, attempting to gain user information by indirect rather than direct means? In a social welfare sense, is a less interactive Internet a worse outcome? Share your thoughts on interactivity...or, in keeping with the post, explain why you would rather not interact.

After about a year or two of teaching at the Saunders College of Business, I decided to make a new rule for my class: I banned personal technology (cell phones, tablets, and computers) from my classroom except during approved group-work sessions. I've had some excellent discussions about this (Hi, Chris!) and wanted to explain why I ban technology and what it would take for me to change my mind.

There are three good reasons why I might allow technology in my class. The first is to let students explore their own creative interests. The second is personal freedom: students have a right to do as they please in class as long as it truly is not a distraction and they are still learning. The third is to allow students to take notes. Any others you can think of? Bear in mind that students can already use technology during group exercises to look up information on companies.

However, I banned technology mainly for the following reasons:

I. Distractions for students and those around them. Have you ever been at a meeting, checked your phone, and found a distracting or urgent text message that made it difficult to refocus? Or noticed someone else checking their phone and reflexively reached for yours? Technology usage is oddly contagious. The problem is that in a classroom of 40 people, a wave of technology checking means no one is really paying attention anymore. There's also interesting research suggesting that being distracted for just 30 seconds means it takes several minutes to refocus. And of course, don't get me started on being subjected to random ringtones, beeps, and clicks in the classroom.

II. Technology envy and diffusion. Back when I allowed technology, I once watched with amusement as more and more laptops cropped up in the class over time. Students started bringing laptops because they could, rather than to improve their learning. One concern about permitting technology in the classroom is that it sets up a bit of a class distinction: taking my notes with pen and paper doesn't seem as fun compared to my neighbor's iPad.

III. Group participation and enforcement issues. Technology can be quite antisocial, and I feel it makes students lose the group dynamic. I've seen students bring in a laptop and then seem to mentally check out of class, not wanting to participate in group exercises and seeming oblivious. Do I then need to tell the student to put the laptop away? Or check their screen? But what if they really are taking notes? Essentially, I don't want to waste classroom time policing technology usage so that other students aren't distracted by someone else's Runescape or Minecraft.

IV. The Average Student Dilemma. I definitely agree that the top creative, responsible minds in my class could do quite well if I just said "For the next 10 minutes, look up anything you can on entrepreneurial finance." I've wanted for some time to devise a class that would rely mostly on individual pacing and motivation, and technology would be a large part of that. However, I've found that when I try to get too creative in the classroom, a lot of students get left behind. A properly designed course built on technology might be great for the top 10%, but it would not necessarily benefit all students. It also means students lack the common knowledge base they gain when we all work together on the same material. And what common technology could I depend on everyone to have? Not everyone has a smartphone, let alone a tablet. Even if I could mandate a device (clickers, for example), I wouldn't want to add extra cost.

Now, one development I’m keeping an eye on is the improvement of e-readers and tablets. IF (big if) it becomes clear that electronic devices are better than pen and paper for taking notes, I would definitely change my policy. I can see that day coming down the road, but I don’t think we are quite there yet. So then, those of you who permit free use of technology in the classroom, how has it been a blessing? Feel free to challenge my points.

Recently I watched the movie Moneyball and thought about what it might mean for quantitatively minded people. (Note: for simplicity, I'll temporarily treat "quantitatively minded" and "operations research" as interchangeable, although not everyone in one group would claim membership in the other.) I watched it partly in the hope that a movie aimed at a mass audience would do a good job of explaining what it is "quants" do.

Given my own previous efforts in explanation, such a movie was sorely needed. For example, as a young PhD student, I still recall trying to tell my friend Rachel what it was I was getting myself into. She listened rather soberly to my detailed explanation of efficiency and improved logistics and then, with a twinkle in her eye, remarked “Oh, so you’re like a UPS deliveryman?” Clearly, my attempt to explain Operations Research had failed.

The movie did something better for me, though, than explain quantitative thinking. Midway through, Brad Pitt's character, Billy Beane, asks Peter Brand, his assistant, whether a certain trade is a good idea. Brand, as I recall, worries that the trade will be difficult to explain to people. Then the movie takes a rather odd turn. Billy Beane says not to worry about explaining. He instead asks whether Brand believes the trade is a good one, and tells him something to the effect that if something is good, it does not need to be explained. Billy Beane's reasoning, if applied to operations research, adds an intriguing flair to some old issues.

First, let's consider the idea of something being so good it doesn't need to be explained. So often, to gain respect for our field (and ourselves), we want to explain operations in simpler terms. We are often process-oriented in operations, so we want to explain the process. But in trying to prove the worth of operations, perhaps we should instead find ways to focus sharply on outcomes. Admittedly, our knowledge of randomness in outcomes may restrain our ability to sell results properly, and sometimes it's difficult to find "Before" and "After" shots of how operations improved things. The simplicity of winning and losing that sports offers seems enviable to those of us trying to prove the worth of operations in less distinct fields. But perhaps we should focus most on the areas where operations techniques defy and improve upon conventional wisdom, just as Billy Beane does in the movie. I believe the true problem is not proving that operations research is valuable, but proving that it is irreplaceable.

Second, consider what Billy Beane says about believing in one's theories. Stretching this logic further, I wonder what would happen if, instead of "The Science of Better," we referred to operations as "The Magic of Better." I define magic very narrowly here, in the sense that operations properly applied often leads to surprising, unpredictable results. The audience doesn't need to know how those results are attained; they just need to enjoy them for the show to be a success. It's enough that the magician (err, operations person) knows how it works.

I am making this suggestion mostly to generate conversation. I doubt that most operations professionals would enjoy being called magicians! But if you will, have patience with this little thought experiment. Would it help to gain more appreciation for quantitative techniques if we did so? Aren’t there times where operations itself seems rather magical?

We've all had that moment of proving a concept or finding an answer when we gasped a little as the pieces fell into place and our intuition outran our logic. This beauty of operations can be difficult to admit, but perhaps more fully embracing these qualities would be beneficial. Despite stereotypes that quantitative approaches are cold and clinical, there can be much warmth and beauty in these arts. The very best teachers and professionals I have worked with are the ones who found ways to translate the beauty of their specialty to others.

No, I don’t think INFORMS would ever support a “The Magic of Better” campaign, nor should they. But, how can you better translate the magic of your field so that others can enjoy the tricks you perform?

In The Long Tail, Chris Anderson uses the statistical notion of the "long tail" to argue that businesses can make a lot of money by selling items that only a few people want. Given the ease of selling items online, a near-complete inventory of everything a human being desires should eventually be available online. Such a strategy is behind the success of firms such as Amazon and Netflix.
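As a back-of-the-envelope illustration of why that tail is worth stocking, here is a toy calculation. The million-item catalog and the 1/rank (Zipf-like) demand curve are my assumptions for the sketch, not figures from Anderson's book:

```python
# Toy Zipf-like catalog: the item at popularity rank r sells in proportion to 1/r.
# All numbers are illustrative, not drawn from Anderson's book.
N_ITEMS = 1_000_000                      # a million distinct items online
HEAD = 1_000                             # the "hits" a physical store might stock

demand = [1 / r for r in range(1, N_ITEMS + 1)]
head_sales = sum(demand[:HEAD])
tail_sales = sum(demand[HEAD:])

print(f"share of total demand in the long tail: "
      f"{tail_sales / (head_sales + tail_sales):.0%}")
# Under these assumptions the tail holds roughly half of all demand,
# even though no single tail item sells more than a trickle.
```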

However, recently I was thinking of another type of long tail. In the classic tale of Moby Dick, Captain Ahab obsesses over the elusive white whale; no other whale will do. He perfunctorily catches a few whales to pass the time, but his real ambition is catching the white whale. The white whale is not searching for Ahab: no, his first mate Starbuck tells him, "Moby-Dick seeks thee not. It is thou, thou, that madly seekest him!" But still Captain Ahab pushes onward, relentlessly ignoring Starbuck's advice to make a safe profit from other, less glamorous whales.

In an information-rich society, the temptation is indeed to chase the long tail to make the perfect haul. For example, I am considering whether to buy a tablet. However, given the already large variety of tablets, prices, and options out there, I find myself inspired to search ever further for the perfect tablet. Why compromise when the perfect tablet is out there? (I can sadly report that clicking on those FREE IPAD ads hasn't worked. I was sure I was the 1,000,000th visitor!) As a result, I suspect my first tablet purchase will be the iPad 30.

Or take an area that has supposedly been revolutionized by the web: dating. You've no doubt heard much about how personality testing and demographically oriented sites have revolutionized dating. In older times, one might have compromised on certain items on one's list. The Buddhist scientist from Liechtenstein living in a small town in Indiana would eventually have realized she was unlikely to find a man with exactly matching characteristics, and might have "settled" for a farmer. I put "settled" in quotes, however, because search time itself exerts a cost. Is she really better off scouring personals sites with quirky names such as LiechMeansLove (groan) in search of the man who matches her unique demographic?

In general, is the search for the far end of the long tail truly a benefit, or is it a curse, as we mutter to ourselves (with apologies to Sculder and Mully) "The deal is out there" while refreshing Google madly? I don't claim this observation is particularly novel, but I am reminded yet again of the need to satisfice (or sacrifice, if you will). Those of us trained in math or the sciences place an even greater emphasis on optimization. However, it's those hidden, hard-to-evaluate search costs that throw off even the best formulas; a toy model below shows how quickly they argue for settling. What costs do you think we are missing as we as customers become ever more precise in what we demand?
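Here is one way to see it: a toy version of the classic economics-of-search (McCall-style) stopping rule. The uniform quality distribution and the cost numbers are my assumptions for the sketch. Once each additional look costs something, the optimal rule is to settle for the first option above a threshold well short of perfect:

```python
import math

# Toy sequential-search model: each new option's quality is Uniform(0, 1)
# and each look costs `c`. The optimal rule is to stop at the first option
# above a threshold v*, where the cost of one more look equals its
# expected improvement:
#   c = E[max(X - v*, 0)] = (1 - v*)**2 / 2   =>   v* = 1 - sqrt(2 * c)

for c in [0.001, 0.01, 0.05]:
    v_star = 1 - math.sqrt(2 * c)
    print(f"search cost {c:>5}: settle for anything above {v_star:.2f}")

# Even a 5% search cost says to accept options scoring about 0.68 out of 1;
# holding out for the perfect tablet (a 0.99) is rarely worth the clicking.
```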

Also, I’d like to wish all education folk a happy return to the classroom!

We’ve all been there…waiting not-so-patiently at a wedding for event X to occur while devouring enough crackers and cheese for a flock of parrots. Recently, I attended a wedding of a good college friend (Congrats Scott and Jenna!), and ended up getting involved behind the scenes to make sure the reception stayed on track. Here are my best attempts to think about how process management and operations could help make your wedding fun.


A Very Operations Christmas


In response to this challenge, I’ve decided to reflect on how my training in Operations could assist me in preparing for the Christmas holidays. I find it fascinating when business and mathematical concepts can also be applied to other parts of everyday life.

Plan backwards, not forwards (Mathematical Programming, Project Management): Those of us who plan for Christmas tend to plan forward according to the time slots we have available. Thus we may shop for presents the weekend before Christmas, and plan to buy the things needed for Christmas on the 23rd. However, this approach can cause big mistakes, as we don't notice that before we can, say, shop for presents, we need to get a list of demands (er, wants) from the gift recipients. Also, this year Christmas falls on a weekend, so plans to shop for presents the weekend before can quickly fall apart.

Instead, consider using an operations technique and plan backwards. Start at Christmas Day and ask yourself, "In order to have a good Christmas, what will I need to have ready on the 25th?" Then move backwards to the 24th, the 23rd, and so on, labeling what you need to do on each day in order to meet your Christmas goals. (A toy version of this backward pass appears below.)
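For the quantitatively inclined, here is a minimal sketch of that backward pass in code; the tasks, durations, and dependency chain are invented for illustration:

```python
from datetime import date, timedelta

# Hypothetical Christmas tasks: name -> (days needed, what must come after it).
# We schedule backwards: each task must finish before its successor starts.
tasks = {
    "wrap presents":      (1, "Christmas Day"),
    "shop for presents":  (2, "wrap presents"),
    "collect wish lists": (3, "shop for presents"),
}

deadlines = {"Christmas Day": date(2011, 12, 25)}

# Walk the chain back from Christmas Day, computing each latest start date.
for name, (duration, successor) in tasks.items():
    latest_finish = deadlines[successor]
    latest_start = latest_finish - timedelta(days=duration)
    deadlines[name] = latest_start
    print(f"{name}: start no later than {latest_start}")
```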

Focus on Processes, not Outcomes (Manufacturing, Quality Control): Let's face it: we've all had excellent Christmases ruined by one malcontent who happens to be related to us, or by a stubborn turkey that even in death resisted the oven's demands to become palatable. Randomness happens to us all, and there's no way to guarantee that the gift recipient will appreciate his present.

However, we can at least make sure that we have done our part to minimize variation and randomness ahead of time. Check for obvious disasters in advance (Christmas morning is not the day to check if your oven still works, for example). Make a list of people you may want to buy presents for. If you intend to attend a church service, make sure it is being held on the day you think it is (some churches only hold a Christmas Eve service, for example). All you can do is set the table for a lovely Christmas; what happens when everyone takes a seat at the table is naturally out of your control. Accepting that life will always have an element of stochasticity to it can be surprisingly uplifting.

Optimize according to your Primary Goal (Mathematical Programming): What is the real meaning of Christmas for you? Is your goal to provide a lovely time of community for your family? To rest from your labors for the year and refresh for the next? To remember the birth of your Christian Savior and Lord? Whichever it is, make your decisions based on that priority. There are parts of the Christmas tradition I leave out or ignore (sorry about the presents, Dad! kidding) because they don't fit my main priority. If you don't know what you want, it's difficult to get it. So why not take just five minutes after reading this post and think about what you want out of this time? (For the incurably quantitative, a toy "Christmas knapsack" below puts the same idea in miniature.)
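If you want to be literal about the mathematical programming, here is a toy sketch: score each tradition by how well it serves your primary goal, then fill your limited hours greedily. The activities, hours, and scores are all invented:

```python
# Toy "Christmas knapsack": pick traditions that best serve your primary goal
# (say, family community) within a limited time budget.
activities = [
    # (name, hours required, value toward the primary goal)
    ("big family dinner",        5, 9),
    ("elaborate light display", 10, 3),
    ("board games with kids",    2, 7),
    ("christmas cards to 100",   6, 2),
    ("church service",           2, 6),
]

budget = 12  # hours available

# Greedy by value per hour: a quick heuristic, not a guaranteed optimum.
plan, hours = [], 0
for name, h, v in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
    if hours + h <= budget:
        plan.append(name)
        hours += h

print(f"plan ({hours}h): {plan}")
```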

Yesterday I attended a conference on Social Media and Communication at RIT. I have a strong interest in social media and thus hoped to find out the latest news from a Marketing and Journalistic perspective. You can read my short, random thoughts on the conference in more detail on my Twitter feed (search for hashtag #smacsrit as well for a better perspective).

EDIT: I’ve made some corrections and additions and added a 6th point for the afternoon crowd. Please feel free to engage me on these points in the comment section with your own observations.

My favorite moment of all was being introduced via Deirdre Breakenridge‘s talk to this excellent graphic from Forrester Research on Internet roles. If you are pressed for time, go look at it and skip the rest of this post. It’s a valuable ontology of the social media user base.

[Figure: Forrester Social Technographics graphic on Internet roles]

Here are five (EDIT: Six!) takeaways that left me thinking after the conference was done:

1. The actual utility of Social Media is still unknown. Malcolm Gladwell wrote a quietly biting essay subtitled "Why the Revolution Will Not Be Tweeted" that I unfortunately had to agree with. Yes, advertisers have bought into social media, by and large, because a large audience lives there. And there are plenty of anecdotes of individual social media successes. On the light-hearted side, I myself once introduced two friends to one another via social media, and they are now engaged. But I left the conference still thinking that gaining the loyalty of followers and viewers, let alone their wallets, is a very uncertain science with poorly designed metrics.

2. Firms are oddly reluctant to directly hire Web 2.0 talent. I was intrigued by Jenny Cisney's story about how she ended up blogging for Kodak [EDIT: the original version said Xerox, my mistake]. Essentially, she was the only known blogger at Kodak, so she was tasked with blogging for Kodak, and became its chief blogger. Just a little prior Web 2.0 experience makes a big difference in avoiding rookie mistakes and knowing how to promote content online. Instead of relying on a business consultant who ten years ago was pushing ERP, why not hire people who have already succeeded on sites such as YouTube and WordPress? Many of them are young and rather affordable, I believe.

3. Who influences the Passive Consumer of Social Media? On the Web, the loudest voices often get heard. People pay the most attention to the most ridiculous comments in the Democrat and Chronicle comment section, correct? However, what do we know about the stay-at-home mother of two who quietly reads the website every day, but never leaves a comment and never clicks on an ad? This type of reader, who quietly absorbs social media but never even updates her own Facebook status except to announce an engagement or a death in the family, tends to get overlooked. Yet such readers outnumber content creators and commenters.

4. For being so innovative, Web 2.0 can have all the quirks of the "Old Boys' Network" it supposedly replaced. EDIT: I did not know the full details in the first draft, my apologies. Re-written for clarity.

One of the stories told during a panel discussion reinforced some of my own experiences of how one gets ahead in Web 2.0. In my opinion, merely having good content is not enough and is often over-rated. Eric Miltsch, the Internet director for Auctiondirectusa.com, pointed out that a major key to his success was simply interviewing and interacting with people who were more popular than he was on Web 2.0, thus benefiting from their reputations. (Please see his comment below this blog for additional insight. My original draft made it seem a bit too easy, but I also wanted to stress the critical importance of Web 2.0 networking to success.) Obviously, if you are boring or bring no value to these influencers and leaders, you will get no benefit from interacting with them. However, my point is that for all the talk about search and Wikipedia supposedly bringing knowledge to the masses, you still have to work with the influencers in order to get noticed. Also, it is necessary to properly separate early adopters from innovators from lead users; each plays a vital role. Look up the details...or take my class at RIT and I'll tell you more. (Cross-promotion is also very important on Web 2.0.)

5. Hybrid, balanced teams will own Web 2.0. My experienced colleague, Dr. John Ettlie, is studying the performance of teams in his classes. For certain problems, he finds that balanced teams, which contain a mix of business students and engineering students, are most successful. I got the same feeling about what it takes to succeed while watching the presentations. An artist may create beautiful visual content but lack the promotional and technical skills to get it noticed online. A business person may recognize the content customers like when they see it, but struggle to create content themselves or to understand the mindset of a Web 2.0 content producer. I believe too many Web 2.0 efforts are led by individuals, when the key is to have a balanced team of content creators, content promoters, and technical specialists. Firms that construct such teams and are willing to go outside their comfort zone to, say, hire a 19-year-old YouTube star to create promotional videos full-time, will own Web 2.0...once we figure out via #1 why it's worth owning!

Bonus! #6. Curating the best content and creating original content is not an either/or choice. Unfortunately, too many companies and blogs focus on only one of these two roles: either they work as curators, collecting the best content from elsewhere on the web and linking to it, or they work as creators, posting their own original content.

First, a company can have its own blog writers instead of relying on outside content. Too many companies delegate such responsibilities to interns, thinking that somehow the very act of being young conveys technical and Web 2.0 expertise. (It's a fallacy I must devote an entire blog post to in the future.) A wise company will instead have several bloggers writing daily, then create one page featuring the best blog post of the day across all writers. This is why the curator role is so important internally as well as externally. It is much easier to give your people total freedom initially, then use an editor or curator to control what is released to the public.

Second, companies need to be less concerned about whom they link to and more willing to accept outside content. From a competitive standpoint, if you fail to provide searchers with useful information in a category they desire, they will go elsewhere to find it. Why not link to that information from your page, so that other companies cannot make you pay for the gap in what you provide? Naturally, such cooperation can go too far. But in my studies of cooperative game theory and reading books such as Co-opetition, I've discovered that there are many more opportunities for cooperation than you may think. Gaining trust and being respected as a source of information can lead to future sales. It's a bit hard to quantify (and I return you to my point #1 again), but I believe it.

Today in my Managing Innovation and Technology class, we discussed the five kinds of technology adopters in Rogers' typology. At first, only innovators and early adopters purchase the new high-tech product. Innovators are the type of customer so excited about technology that they might even tape themselves unwrapping it, a trend known as "unboxing."

Subsequently, a company faces the challenge of convincing the early majority to adopt the new product. Geoffrey Moore's book Crossing the Chasm is built around this concept and is worth a read. Finally, the company attracts the late majority and laggard segments by cutting price and/or making the product more reliable. See the adoption life cycle diagram below, via Craig Chelius/Geoffrey Moore:

[Figure: technology adoption life cycle curve]
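As a quick aside for the quantitatively minded, the segment sizes in that diagram come from standard-deviation cutoffs on a normal curve of adoption time, which a few lines of code can recover:

```python
from statistics import NormalDist

# Rogers defines the five adopter groups by standard-deviation cutoffs on a
# normal ("bell curve") distribution of when customers adopt.
z = NormalDist()  # standard normal: mean adoption time 0, std dev 1
groups = [
    ("innovators",     float("-inf"), -2.0),
    ("early adopters", -2.0, -1.0),
    ("early majority", -1.0,  0.0),
    ("late majority",   0.0,  1.0),
    ("laggards",        1.0,  float("inf")),
]
for name, lo, hi in groups:
    share = z.cdf(hi) - z.cdf(lo)
    print(f"{name:>14}: {share:.1%}")
# Prints roughly 2.3%, 13.6%, 34.1%, 34.1%, and 15.9% -- the familiar
# proportions on the adoption life cycle curve.
```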

What fascinates me today is what happens when a technology becomes so ubiquitous that not having it makes you odd. While promoting a new album, the rapper Eminem admitted he did not know how to use a computer. This prompted sensationalistic blog headlines such as "Shock of the Century" from astounded bloggers. We're now at a point where someone whose career does not require computer usage is insulted for being computer illiterate. There's a great philosophical discussion to be had about what happens when our culture separates into tech haves and have-nots, but I digress.

The question for today is: as a business owner, how can you take advantage of such tech ubiquity? I want to present three rather counter-intuitive ideas on potential strategies and get your thoughts on each. I stress that these are conversation-starters rather than recommendations, and not necessarily a good idea for all firms.

1. When a technology has become ubiquitous, completely neglect the opinions of innovators and early adopters. But wait, aren't those opinions a good source of new ideas? Still, consider this. Innovators and early adopters become more demanding as time goes on. Yesterday's hot technology is today's ho-hum customer staple. Because early adopters are often thought leaders, companies still try to stay in their favor. However, the cost of pleasing sophisticated, ever-more-demanding customers for a ubiquitous technology may become unbearable for many firms. If summer blockbusters can become wildly popular despite the dismay of many a movie critic, why not your product?

2. Make sure fundamental aspects of your product or service do not rely solely on ubiquitous technology. It is tempting to think that ubiquitous technology cannot be replaced, and to lean heavily on existing platforms. However, in the last decade, once-ubiquitous technologies such as analog TV and the VCR have been replaced by new standards. Shorter product life cycles make it easier for companies to recover from betting on the wrong technology, but ask someone trapped using DOS software at work how difficult it can be to retire products built on old technology!

Besides the product being sold, consider customer service and marketing. Companies are emphasizing websites much more in their advertising campaigns, but more than 20% of Americans still do not have access to the Internet. It is tempting for tech-savvy office workers to just assume that customers will be able to access their website, or receive information via text or phone. But doing so leaves some customers unable to participate. There is still a strong no-tech/low-tech market out there that will appreciate simpler options in customer service and marketing.

3. Consider designing new products to target Late Majority and Laggard adopter groups. Amusingly, as a technology becomes ubiquitous, these two groups become the new customers, just as innovators once were. It's tempting to see these groups as holdouts who must be supported and cajoled into accepting the basic technology that the other adopters already know and love. However, consider the appeal of the iPad to senior citizens, many of whom were reluctant or unable to use computers. The iPad was not, of course, designed specifically with seniors in mind, but I think there's a key principle to be gleaned from this story.

Sometimes, rather than developing simpler versions of existing technology to try to coax already-burned laggards back into the market, the answer is to try completely new products built on different core concepts. This is not quite targeting segment zero, nor is it a discontinuous technology per se, although elements of both are present in such a strategy. Rather, it is using the needs of the least technologically skilled users (“follower users”, if you will, as opposed to lead users) to drive innovation.

Any thoughts on those loosely-sketched ideas and the companies/industries that would best benefit by putting them into practice? How do you feel about technology ubiquity as a whole?
