About espirl

eSpirl has been an integral part of developing online communities of practice. eSpirl delivers campaigns designed to activate passionate users in online communities of practice, and it develops initiatives to promote those communities that are measurable, predictable, and scalable. It promotes dialogue and innovation. We develop and distribute information that is consistent, relevant, and valuable, broadening the community of practice. eSpirl has been involved in building a large, targeted online subscriber base through an online communications strategy and detailed communication plans, and it incorporates social networking elements as part of its very fabric.

Five routes to more innovative problem solving

Tricky problems must be shaped before they can be solved. To start that process, and stimulate novel thinking, leaders should look through multiple lenses.

Rob McEwen had a problem. The chairman and chief executive officer of Canadian mining group Goldcorp knew that its Red Lake site could be a money-spinner—a mine nearby was thriving—but no one could figure out where to find high-grade ore. The terrain was inaccessible, operating costs were high, and the unionized staff had already gone on strike. In short, McEwen was lumbered with a gold mine that wasn’t a gold mine.

Then inspiration struck. Attending a conference about recent developments in IT, McEwen was smitten with the open-source revolution. Bucking fierce internal resistance, he created the Goldcorp Challenge: the company put Red Lake’s closely guarded geological data online and offered $575,000 in prize money to anyone who could identify rich drill sites. To the astonishment of players in the mining sector, upward of 1,400 technical experts based in 50-plus countries took up the problem. The result? Two Australian teams, working together, found locations that have made Red Lake one of the world’s richest gold mines. “From a remote site, the winners were able to analyze a database and generate targets without ever visiting the property,” McEwen said. “It’s clear that this is part of the future.”

McEwen intuitively understood the value of taking a number of different approaches simultaneously to solving difficult problems. A decade later, we find that this mind-set is ever more critical: business leaders are operating in an era when forces such as technological change and the historic rebalancing of global economic activity from developed to emerging markets have made the problems increasingly complex, the tempo faster, the markets more volatile, and the stakes higher. The number of variables at play can be enormous, and free-flowing information encourages competition, placing an ever-greater premium on developing innovative, unique solutions.

This article presents an approach for doing just that. How? By using what we call flexible objects for generating novel solutions, or flexons, which provide a way of shaping difficult problems to reveal innovative solutions that would otherwise remain hidden. This approach can be useful in a wide range of situations and at any level of analysis, from individuals to groups to organizations to industries. To be sure, this is not a silver bullet for solving any problem whatever. But it is a fresh mechanism for representing ambiguous, complex problems in a structured way to generate better and more innovative solutions.

The flexons approach

Finding innovative solutions is hard. Precedent and experience push us toward familiar ways of seeing things, which can be inadequate for the truly tough challenges that confront senior leaders. After all, if a problem can be solved before it escalates to the C-suite, it typically is. Yet we know that teams of smart people from different backgrounds are more likely to come up with fresh ideas more quickly than individuals or like-minded groups do. When a diverse range of experts—game theorists to economists to psychologists—interact, their approaches to problems differ from those that individuals use. The solution space becomes broader, increasing the chance that a more innovative answer will be found.

Obviously, people do not always have think tanks of PhDs trained in various approaches at their disposal. Fortunately, generating diverse solutions to a problem does not require a diverse group of problem solvers. This is where flexons come into play. While traditional problem-solving frameworks address particular problems under particular conditions—creating a compensation system, for instance, or undertaking a value-chain analysis for a vertically integrated business—they have limited applicability. They are, if you like, specialized lenses. Flexons offer languages for shaping problems, and these languages can be adapted to a much broader array of challenges. In essence, flexons substitute for the wisdom and experience of a group of diverse, highly educated experts.

To accommodate the world of business problems, we have identified five flexons, or problem-solving languages. Derived from the social and natural sciences, they help users understand the behavior of individuals, teams, groups, firms, markets, institutions, and whole societies. We arrived at these five through a lengthy process of synthesizing both formal literatures and the private knowledge systems of experts, and trial and error on real problems informed our efforts. We don’t suggest that these five flexons are exhaustive—only that we have found them sufficient, in concert, to tackle very difficult problems. While serious mental work is required to tailor the flexons to a given situation, and each retains blind spots arising from its assumptions, multiple flexons can be applied to the same problem to generate richer insights and more innovative solutions.

Networks flexon

Imagine a map of all of the people you know, ranked by their influence over you. It would show close friends and vague acquaintances, colleagues at work and college roommates, people who could affect your career dramatically and people who have no bearing on it. All of them would be connected by relationships of trust, friendship, influence, and the probabilities that they will meet. Such a map is a network that can represent anything from groups of people to interacting product parts to traffic patterns within a city—and therefore can shape a whole range of business problems.

For example, certain physicians are opinion leaders who can influence colleagues about which drugs to prescribe. To reveal relationships among physicians and help identify those best able to influence drug usage, a pharmaceutical company launching a product could create a network map of doctors who have coauthored scientific articles. By targeting clusters of physicians who share the same ideas and (one presumes) have tight interactions, the company may improve its return on investments compared with what traditional mass-marketing approaches would achieve. The network flexon helps decompose a situation into a series of linked problems of prediction (how will ties evolve?) and optimization (how can we maximize the relational advantage of a given agent?) by presenting relationships among entities. These problems are not simple, to be sure. But they are well-defined and structured—a fundamental requirement of problem solving.
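
To make the mechanics concrete, here is a minimal sketch of that mapping in Python using the networkx library. The coauthorship pairs are invented for illustration, and degree centrality is used only as a rough proxy for opinion leadership.

```python
# Minimal sketch of the co-authorship network idea (hypothetical data).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Each tuple is a pair of physicians who have coauthored at least one article.
coauthorships = [
    ("Dr. A", "Dr. B"), ("Dr. A", "Dr. C"), ("Dr. B", "Dr. C"),
    ("Dr. C", "Dr. D"), ("Dr. D", "Dr. E"), ("Dr. E", "Dr. F"),
]

G = nx.Graph()
G.add_edges_from(coauthorships)

# Degree centrality as a rough proxy for opinion leadership.
centrality = nx.degree_centrality(G)
opinion_leaders = sorted(centrality, key=centrality.get, reverse=True)[:3]

# Clusters of tightly connected physicians to target as groups.
clusters = greedy_modularity_communities(G)

print("Likely opinion leaders:", opinion_leaders)
print("Clusters to target:", [sorted(c) for c in clusters])
```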

Evolutionary flexon

Evolutionary algorithms have won games of chess and solved huge optimization problems that overwhelm most computational resources. Their success rests on the power of generating diversity by introducing randomness and parallelization into the search procedure and quickly filtering out suboptimal solutions. Representing entities as populations of parents and offspring subject to variation, selection, and retention is useful in situations where businesses have limited control over a large number of important variables and only a limited ability to calculate the effects of changing them, whether they’re groups of people, products, project ideas, or technologies. Sometimes, you must make educated guesses, test, and learn. But even as you embrace randomness, you can harness it to produce better solutions to complex problems.

That’s because not all “guessing strategies” are created equal. We have crucial choices to make: generating more guesses (prototypes, ideas, or business models) or spending more time developing each guess or deciding which guesses will survive. Consider a consumer-packaged-goods company trying to determine if a new brand of toothpaste will be a hit or an expensive failure. Myriad variables—everything from consumer habits and behavior to income, geography, and the availability of clean water—interact in multiple ways. The evolutionary flexon may suggest a series of low-cost, small-scale experiments involving product variants pitched to a few well-chosen market segments (for instance, a handful of representative customers high in influence and skeptical about new ideas). With every turn of the evolutionary-selection crank, the company’s predictions will improve.
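
As a rough illustration of that evolutionary-selection crank, the sketch below runs a toy generate-test-select loop in Python; the product attributes and fitness score are invented stand-ins for a real small-scale market test.

```python
# Toy evolutionary loop: generate variants, test, keep the fittest, repeat.
import random

def fitness(variant):
    # Hypothetical stand-in for a small-scale market test score.
    price, mint_level = variant
    return -(price - 3.0) ** 2 - (mint_level - 0.7) ** 2

def mutate(variant):
    # Small random tweaks create the next generation of "guesses".
    price, mint_level = variant
    return (price + random.gauss(0, 0.2), mint_level + random.gauss(0, 0.05))

# Start with a diverse population of product variants (price, mint level).
population = [(random.uniform(1, 6), random.uniform(0, 1)) for _ in range(20)]

for generation in range(10):
    # Selection: retain the best-performing half of the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    # Variation: each survivor produces a mutated offspring.
    population = survivors + [mutate(v) for v in survivors]

print("Best variant after 10 generations:", max(population, key=fitness))
```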

Decision-agent flexon

To the economic theorist, social behavior is the outcome of interactions among individuals, each of whom tries to select the best possible means of achieving his or her ends. The decision-agent flexon takes this basic logic to its limit by providing a way of representing teams, firms, and industries as a series of competitive and cooperative interactions among agents. The basic approach is to determine the right level of analysis—firms, say. Then you ascribe to them beliefs and motives consistent with what you know (and think they know), consider how their payoffs change through the actions of others, determine the combinations of strategies they might collectively use, and seek an equilibrium where no agent can unilaterally deviate from the strategy without becoming worse off.

Game theory is the classic example, but it’s worth noting that a decision-agent flexon can also incorporate systematic departures from rationality: impulsiveness, cognitive shortcuts such as stereotypes, and systematic biases. Taken as a whole, this flexon can describe all kinds of behavior, rational and otherwise, in one self-contained problem-solving language whose most basic variables comprise agents (individuals, groups, organizations) and their beliefs, payoffs, and strategies.
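
A minimal sketch can make those basic variables tangible. The hypothetical two-firm payoff matrix below enumerates the pure-strategy outcomes from which no agent can gain by unilaterally deviating; the strategies and payoffs are invented for illustration.

```python
# Find pure-strategy equilibria in a 2x2 game (hypothetical payoffs).
strategies = ["expand", "hold"]

# payoffs[(a, b)] gives (payoff to firm A, payoff to firm B).
payoffs = {
    ("expand", "expand"): (1, 1),
    ("expand", "hold"):   (4, 0),
    ("hold",   "expand"): (0, 4),
    ("hold",   "hold"):   (3, 3),
}

def is_equilibrium(a, b):
    # No agent can do better by unilaterally deviating.
    a_best = all(payoffs[(a, b)][0] >= payoffs[(alt, b)][0] for alt in strategies)
    b_best = all(payoffs[(a, b)][1] >= payoffs[(a, alt)][1] for alt in strategies)
    return a_best and b_best

equilibria = [(a, b) for a in strategies for b in strategies if is_equilibrium(a, b)]
print("Pure-strategy equilibria:", equilibria)  # -> [('expand', 'expand')]
```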

For instance, financial models to optimize the manufacturing footprint of a large industrial company would typically focus on relatively easily quantifiable variables such as plant capacity and input costs. To take a decision-agent approach, you assess the payoffs and likely strategies of multiple stakeholders—including customers, unions, and governments—in the event of plant closures. Adding the incentives, beliefs, and strategies of all stakeholders to the analysis allows the company to balance the trade-offs inherent in a difficult decision more effectively.

System-dynamics flexon

Assessing a decision’s cascading effects on complex businesses is often a challenge. Making the relations between variables of a system, along with the causes and effects of decisions, more explicit allows you to understand their likely impact over time. A system-dynamics lens shows the world in terms of flows and accumulations of money, matter (for example, raw materials and products), energy (electrical current, heat, radio-frequency waves, and so forth), or information. It sheds light on a complex system by helping you develop a map of the causal relationships among key variables, whether they are internal or external to a team, a company, or an industry; subjectively or objectively measurable; or instantaneous or delayed in their effects.

Consider the case of a deep-sea oil spill, for example. A source (the well) emits a large volume of crude oil through a sequence of pipes (which throttle the flow and can be represented as inductors) and intermediate-containment vessels (which accumulate the flow and can be modeled as capacitors). Eventually, the oil flows into a sink (which, in this case, is unfortunately the ocean). A pressure gradient drives the flow rate of oil from the well into the ocean. Even an approximate model immediately identifies ways to mitigate the spill’s effects short of capping the well. These efforts could include reducing the pressure gradient driving the flow of crude, decreasing the loss of oil along the pipe, increasing the capacity of the containment vessels, or increasing or decreasing the inductance of the flow lines. In this case, a loosely defined phenomenon such as an oil spill becomes a set of precisely posed problems addressable sequentially, with cumulative results.
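
The same flows-and-accumulations logic can be roughed out in a few lines of code. The sketch below is a toy stock-and-flow model with invented numbers, not an engineering model of any real spill; it simply shows how each mitigation lever in the text maps to a parameter.

```python
# Rough stock-and-flow sketch of the spill model (all numbers hypothetical).
dt = 1.0                   # time step (hours)
pressure_gradient = 100.0  # drives the flow rate out of the well
flow_resistance = 5.0      # pipe throttling of the flow
vessel_capacity = 500.0    # intermediate containment vessel

vessel_level = 0.0   # stock: oil held in containment
ocean_total = 0.0    # stock: oil reaching the sink (the ocean)

for hour in range(48):
    inflow = pressure_gradient / flow_resistance          # flow from the well per hour
    captured = min(inflow, vessel_capacity - vessel_level)
    vessel_level += captured * dt
    ocean_total += (inflow - captured) * dt               # overflow escapes to the sink

print(f"Oil in containment: {vessel_level:.0f}, oil in ocean: {ocean_total:.0f}")

# Each mitigation lever in the text maps to a parameter:
# lower pressure_gradient, raise flow_resistance, or raise vessel_capacity.
```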

Information-processing flexon

When someone performs long division in her head, a CEO makes a strategic decision by aggregating imperfect information from an executive team, or Google servers crunch Web-site data, information is being transformed intelligently. This final flexon provides a lens for viewing various parts of a business as information-processing tasks, similar to the way such tasks are parceled out among different computers. It focuses attention on what information is used, the cost of computation, and how efficiently the computational device solves certain kinds of problems. In an organization, that device is a collection of people, whose processes for deliberating and deciding are the most important explanatory variable of decision-making’s effectiveness.

Consider the case of a private-equity firm seeking to manage risk. A retrospective analysis of decisions by its investment committee shows that past bets have been much riskier than its principals assumed. To understand why, the firm examines what information was transmitted to the committee and how decisions by individuals would probably have differed from those of the committee, given its standard operating procedures. Interviews and analysis show that the company has a bias toward riskier investments and that it stems from a near-unanimity rule applied by the committee: two dissenting members are enough to prevent an investment. The insistence on near-unanimity is counterproductive because it stifles debate: the committee’s members (only two of whom could kill any deal) are reluctant to speak first and be perceived as an “enemy” by the deal sponsor. And the more senior the sponsor, the more likely it is that risky deals will be approved. Raising the number of votes required to kill deals, while clearly counterintuitive, would stimulate a richer dialogue.
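
A back-of-the-envelope model shows how the voting rule and members’ willingness to dissent interact. The committee size and probabilities below are invented purely for illustration: the point is that a low kill threshold can suppress dissent, while a higher threshold paired with freer debate can screen out more risky deals.

```python
# Back-of-the-envelope model of the investment committee (numbers hypothetical).
from math import comb

members = 7

def approval_probability(p_dissent, dissents_needed_to_kill):
    # Deal approved when fewer than the threshold number of members dissent.
    return sum(
        comb(members, k) * p_dissent**k * (1 - p_dissent)**(members - k)
        for k in range(dissents_needed_to_kill)
    )

# Current rule: only two dissents kill a deal, so members hold back (low dissent rate).
print("Near-unanimity rule:", f"{approval_probability(0.10, 2):.0%} of risky deals approved")

# Proposed rule: more dissents are needed to kill, so debate opens up (higher dissent rate).
print("Higher threshold:   ", f"{approval_probability(0.40, 4):.0%} of risky deals approved")
```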

Putting flexons to work

We routinely use these five problem-solving lenses in workshops with executive teams and colleagues to analyze particularly ambiguous and complex challenges. Participants need only a basic familiarity with the different approaches to reframe problems and generate more innovative solutions. Here are two quite different examples of the kinds of insights that emerge from the use of several flexons, whose real power emerges in combination.

Reorganizing for innovation

A large biofuel manufacturer that wants to improve the productivity of its researchers can use flexons to illuminate the problem from very different angles.

Networks. It’s possible to view the problem as a need to design a better innovation network by mapping the researchers’ ties to one another through co-citation indices, counting the number of e-mails sent between researchers, and using a network survey to reveal the strength and density of interactions and collaborative ties. If coordinating different knowledge domains is important to a company’s innovation productivity, and the current network isn’t doing so effectively, the company may want to create an internal knowledge market in which financial and status rewards accrue to researchers who communicate their ideas to co-researchers. Or the company could encourage cross-pollination by setting up cross-discipline gatherings, information clearinghouses, or wiki-style problem-solving sites featuring rewards for solutions.

Evolution. By describing each lab as a self-contained population of ideas and techniques, a company can explore how frequently new ideas are generated and filtered and how stringent the selection process is. With this information, it can design interventions to generate more varied ideas and to change the selection mechanism. For instance, if a lot of research activity never seems to lead anywhere, the company might take steps to ensure that new ideas are presented more frequently to the business-development team, which can provide early feedback on their applicability.

Decision agents. We can examine in detail how well the interests of individual researchers and the organization are aligned. What financial and nonfinancial benefits accrue to individuals who initiate or terminate a search or continue a search that is already under way? What are the net benefits to the organization of starting, stopping, or continuing to search along a given trajectory? Search traps or failures may be either Type I (pursuing a development path unlikely to reach a profitable solution) or Type II (not pursuing a path likely to reach a profitable solution). To better understand the economics at play, it may be possible to use industry and internal data to multiply the probabilities of these errors by their costs. That economic understanding, in turn, permits a company to tailor incentives for individuals to minimize Type I errors (by motivating employees to reject apparent losers more quickly) or Type II errors (by motivating them to persist along paths of uncertain value slightly longer than they normally would).
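
As a hypothetical illustration of that arithmetic, the sketch below multiplies the probability of each error type by its cost; every figure is invented, and a real analysis would draw the probabilities and costs from industry and internal data.

```python
# Hypothetical expected-cost comparison of search errors (all figures invented).
projects_per_year = 40

type1_probability = 0.20   # pursuing a path unlikely to pay off
type1_cost = 1.5           # $M wasted per dead end pursued

type2_probability = 0.05   # abandoning a path that would have paid off
type2_cost = 12.0          # $M of profit forgone per missed winner

expected_type1 = projects_per_year * type1_probability * type1_cost
expected_type2 = projects_per_year * type2_probability * type2_cost

print(f"Expected Type I cost:  ${expected_type1:.1f}M per year")
print(f"Expected Type II cost: ${expected_type2:.1f}M per year")
# With these numbers, missed winners (Type II) cost twice as much per year as dead ends,
# so incentives might reward persisting slightly longer on uncertain paths.
```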

Predicting the future

Now consider the case of a multinational telecommunications service provider that operates several major broadband, wireless, fixed, and mobile networks around the world, using a mix of technologies (such as 2G and 3G). It wants to develop a strategic outlook that takes into consideration shifting demographics, shifting technologies for connecting users with one another and with its core network (4G), and shifting alliances—to say nothing of rapidly evolving players from Apple to Qualcomm. This problem is complicated, with a range of variables and forces at work, and so broad that crafting a strategy with big blind spots is easy. Flexons can help.

Each view of the world described below provides valuable food for thought, including potential strategic scenarios, technology road maps, and possibilities for killer apps. More hard work is needed to synthesize the findings into a coherent worldview, but the different perspectives provided by flexons illuminate potential solutions that might otherwise be missed.

Decision agents. Viewing the problem in this way emphasizes the incentives for different industry players to embrace new technologies and service levels. By enumerating a range of plausible scenarios from the perspective of customers and competitors, the network service provider can establish baseline assessments of future pricing, volume levels, and investment returns.

Networks. This lens allows a company or its managers to look at the industry as a pattern of exchange relationships between paying customers and providers of services, equipment, chips, operating systems, and applications, and then to examine the properties of each exchange network. The analysis may reveal that not all innovations and new end-user technologies are equal: some provide an opportunity for differentiation at critical nodes in the network; others do not.

System dynamics. This flexon focuses attention on data-flow bottlenecks in applications ranging from e-mail and voice calls to video downloads, games, and social-networking interactions. The company can build a network-optimization map to predict and optimize capital expenditures for network equipment as a function of expected demand, information usage, and existing constraints. Because cost structures matter deeply to annuity businesses (such as those of service providers) facing demand fluctuations, the resulting analysis may radically affect which services a company believes it can and cannot offer in years to come.

Flexons help turn chaos into order by representing ambiguous situations and predicaments as well-defined, analyzable problems of prediction and optimization. They allow us to move up and down between different levels of detail to consider situations in all their complexity. And, perhaps most important, flexons allow us to bring diversity inside the head of the problem solver, offering more opportunities to discover counterintuitive insights, innovative options, and unexpected sources of competitive advantage.

Gartner Hype Cycle for Emerging Technologies, 2013

Gartner’s 2013 Hype Cycle for Emerging Technologies Maps Out Evolving Relationship Between Humans and Machines

2013 Hype Cycle Special Report Evaluates the Maturity of More Than 1,900 Technologies

Gartner to Host Complimentary Webinar “Emerging Technologies Hype Cycle for 2013: Redefining the Relationship,” August 21 at 10 a.m. EDT and 1 p.m. EDT

The evolving relationship between humans and machines is the key theme of Gartner, Inc.’s “Hype Cycle for Emerging Technologies, 2013.” Gartner has chosen to feature the relationship between humans and machines due to the increased hype around smart machines, cognitive computing and the Internet of Things. Analysts believe that the relationship is being redefined through emerging technologies, narrowing the divide between humans and machines.

Gartner’s 2013 Hype Cycle Special Report provides strategists and planners with an assessment of the maturity, business benefit and future direction of more than 2,000 technologies, grouped into 98 areas. New Hype Cycles this year include content and social analytics, embedded software and systems, consumer market research, open banking, banking operations innovation, and information and communication technology (ICT) in Africa.

The Hype Cycle for Emerging Technologies report is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that senior executives, CIOs, strategists, innovators, business developers and technology planners should consider in developing emerging-technology portfolios.

“It is the broadest aggregate Gartner Hype Cycle, featuring technologies that are the focus of attention because of particularly high levels of hype, or those that Gartner believes have the potential for significant impact,” said Jackie Fenn, vice president and Gartner fellow.

“In making the overriding theme of this year’s Hype Cycle the evolving relationship between humans and machines, we encourage enterprises to look beyond the narrow perspective that only sees a future in which machines and computers replace humans. In fact, by observing how emerging technologies are being used by early adopters, there are actually three main trends at work. These are augmenting humans with technology — for example, an employee with a wearable computing device; machines replacing humans — for example, a cognitive virtual assistant acting as an automated customer representative; and humans and machines working alongside each other — for example, a mobile robot working with a warehouse employee to move many boxes.”

“Enterprises of the future will use a combination of these three trends to improve productivity, transform citizen and customer experience, and to seek competitive advantage,” said Hung LeHong, research vice president at Gartner. “These three major trends are made possible by three areas that facilitate and support the relationship between human and machine. Machines are becoming better at understanding humans and the environment — for example, recognizing the emotion in a person’s voice — and humans are becoming better at understanding machines — for example, through the Internet of things. At the same time, machines and humans are getting smarter by working together.”

Figure 1. Hype Cycle for Emerging Technologies, 2013 (Source: Gartner, August 2013)

The 2013 Emerging Technologies Hype Cycle highlights technologies that support all six of these areas including:

1. Augmenting humans with technology

Technologies make it possible to augment human performance in physical, emotional and cognitive areas. The main benefit to enterprises in augmenting humans with technology is to create a more capable workforce. For example, consider what would happen if all employees had access to wearable technology that could answer any product or service question or pull up any enterprise data at will. Their ability to improve productivity, sell better or serve customers better would increase significantly. Enterprises interested in these technologies should look to bioacoustic sensing, quantified self, 3D bioprinting, brain-computer interface, human augmentation, speech-to-speech translation, neurobusiness, wearable user interfaces, augmented reality and gesture control.

2. Machines replacing humans

There are clear opportunities for machines to replace humans: dangerous work, simpler yet expensive-to-perform tasks and repetitive tasks. The main benefit to having machines replace humans is improved productivity, less danger to humans and sometimes better quality work or responses. For example, a highly capable virtual customer service agent could field the many straightforward questions from customers and replace much of the customer service agents’ “volume” work — with the most up-to-date information. Enterprises should look to some of these representative technologies for sources of innovation on how machines can take over human tasks: volumetric and holographic displays, autonomous vehicles, mobile robots and virtual assistants.

3. Humans and machines working alongside each other

Humans versus machines is not a binary choice; there are times when machines working alongside humans are the better option. A new generation of robots is being built to work alongside humans. IBM’s Watson does background research for doctors, just like a research assistant, to ensure they account for all the latest clinical, research and other information when making diagnoses or suggesting treatments. The main benefits of having machines working alongside humans are the ability to access the best of both worlds (that is, productivity and speed from machines, emotional intelligence and the ability to handle the unknown from humans). Technologies that represent and support this trend include autonomous vehicles, mobile robots, natural language question and answering, and virtual assistants.

The three trends that will change the workforce and the everyday lives of humans in the future are enabled by a set of technologies that help both machines and humans better understand each other. The following three areas are a necessary foundation for the synergistic relationships to evolve between humans and machines:

4. Machines better understanding humans and the environment

Machines and systems can only benefit from a better understanding of human context, humans and human emotion. This understanding ranges from simple context-aware interactions, such as displaying an operational report for the location closest to the user; to a better understanding of customers, such as gauging consumer sentiment for a new product line by analyzing Facebook postings; to complex dialogue with customers, such as virtual assistants using natural-language question and answering to handle customer inquiries. The technologies on this year’s Hype Cycle that represent these capabilities include bioacoustic sensing, smart dust, quantified self, brain computer interface, affective computing, biochips, 3D scanners, natural-language question and answering (NLQA), content analytics, mobile health monitoring, gesture control, activity streams, biometric authentication methods, location intelligence and speech recognition.

5. Humans better understanding machines

As machines get smarter and start automating more human tasks, humans will need to trust the machines and feel safe. The technologies that make up the Internet of things will provide increased visibility into how machines are operating and the environmental situation they are operating in. For example, IBM’s Watson provides “confidence” scores for the answers it provides to humans, while Baxter shows a confused facial expression on its screen when it does not know what to do. MIT has also been working on Kismet, a robot that senses social cues from visual and auditory sensors, and responds with facial expressions that demonstrate understanding. These types of technology are very important in allowing humans and machines to work together. The 2013 Hype Cycle features the Internet of Things, machine-to-machine communication services, mesh networks: sensor, and activity streams.

6. Machines and humans becoming smarter

The surge in big data, analytics and cognitive computing approaches will provide decision support and automation to humans, and awareness and intelligence to machines. These technologies can be used to make both humans and things smarter. NLQA technology can improve a virtual customer service representative. NLQA can also be used by doctors to research huge amounts of medical journals and clinical tests to help diagnose an ailment or choose a suitable treatment plan. These supporting technologies are foundational for both humans and machines as we move forward to a digital future, and enterprises should consider quantum computing, prescriptive analytics, neurobusiness, NLQA, big data, complex event processing, in-memory database management system (DBMS), cloud computing, in-memory analytics and predictive analytics.

Innovation tools for the Management Writer

Creativity Techniques 

The tools in this section can help you to become more creative. They are designed to help you devise creative and imaginative solutions to problems, and help you to spot opportunities that you might otherwise miss.

Before you continue, it is important to understand what we mean by creativity, as there are two completely different types. The first is technical creativity, where people create new theories, technologies or ideas. This is the type of creativity we discuss here. The second is artistic creativity, which is more born of skill, technique and self-expression. Artistic creativity is beyond the scope of these articles.

Many of the techniques in this chapter have been used by great thinkers to drive their creativity. Albert Einstein, for example, used his own informal variant of Provocation to trigger ideas that led to the Theory of Relativity.

Approaches to Creativity

There are two main strands to technical creativity: programmed thinking and lateral thinking. Programmed thinking relies on logical or structured ways of creating a new product or service. Examples of this approach are Morphological Analysis and the Reframing Matrix.

The other main strand uses ‘Lateral Thinking’. Examples of this are Brainstorming, Random Input and Provocation. Lateral Thinking has been developed and popularized by Edward de Bono.

Programmed Thinking and Lateral Thinking

Lateral thinking recognizes that our brains are pattern recognition systems, and that they do not function like computers. It takes years of training before we learn to do simple arithmetic – something that computers do very easily. On the other hand, we can instantly recognize patterns such as faces, language, and handwriting. The only computers that begin to be able to do these things do it by modeling the way that human brain cells work. Even then, computers will need to become more powerful before they approach our ability to handle patterns.

The benefit of good pattern recognition is that we can recognize objects and situations very quickly. Imagine how much time would be wasted if you had to do a full analysis every time you came across a cylindrical canister of effervescent fluid. Most people would just open their can of fizzy drink. Without pattern recognition we would starve or be eaten. We could not cross the road safely.

Unfortunately, we get stuck in our patterns. We tend to think within them. Solutions we develop are based on previous solutions to similar problems. Normally it does not occur to us to use solutions belonging to other patterns.

We use lateral thinking techniques to break out of this patterned way of thinking.

Lateral thinking techniques help us to come up with startling, brilliant and original solutions to problems and opportunities.

It is important to point out that each type of approach has its strength. Logical, disciplined thinking is enormously effective in making products and services better. It can, however, only go so far before all practical improvements have been carried out. Lateral thinking can generate completely new concepts and ideas, and brilliant improvements to existing systems. In the wrong place, however, it can be sterile or unnecessarily disruptive.

Taking the Best of Each…

A number of techniques fuse the strengths of the two different strands of creativity. Techniques such as the Concept Fan use a combination of programmed and lateral thinking. DO IT and Min Basadur’s Simplex embed the two approaches within problem-solving processes. While these may be considered ‘overkill’ when dealing with minor problems, they provide excellent frameworks for solving difficult and serious ones.

The Creative Frame of Mind

Often the only difference between creative and uncreative people is self-perception. Creative people see themselves as creative and give themselves the freedom to create. Uncreative people do not think about creativity and do not give themselves the opportunity to create anything new.

Being creative may just be a matter of setting aside the time needed to take a step back and allow yourself to ask yourself if there is a better way of doing something. Edward de Bono calls this a ‘Creative Pause’. He suggests that this should be a short break of maybe only 30 seconds, but that this should be a habitual part of thinking. This needs self-discipline, as it is easy to forget.

Another important attitude-shift is to view problems as opportunities for improvement. While this is something of a cliché, it is true. Whenever you solve a problem, you have a better product or service to offer afterwards.

Using Creativity

Creativity is sterile if action does not follow from it. Ideas must be evaluated, improved, polished and marketed before they have any value. Other sections of Mind Tools lay out the evaluation, analysis and planning tools needed to do this. They also explain the time and stress management techniques you will need when your creative ideas take off.

Concept Cards™ – Innovation Tool for Developing Insights

Concept Cards™ are one of the tools we use in innovation consulting engagements to help client teams develop insights. They are single-frame documents that make complex concepts accessible and usable in the planning process.

Imagine having a hundred or several hundred concepts from in and around your business and industry, from the environment, from psychology, from business models, and from management theory to combine and recombine – to play with to stimulate creative conversation and insight. That can be invaluable in helping to create the innovative products and services needed to be a leader in your industry.

6-3-5 Brainwriting

6-3-5 Brainwriting (also known as the 6-3-5 Method, or Method 635) is a group creativity technique used in marketing, advertising, design, writing and product development originally developed by Professor Bernd Rohrbach in 1968.

Based on the concept of Brainstorming, the aim of 6-3-5 Brainwriting is to generate 108 new ideas in half an hour. In a similar way to brainstorming, it is not the quality of ideas that matters but the quantity.

The technique involves 6 participants who sit in a group and are supervised by a moderator. Each participant thinks up 3 ideas every 5 minutes. The ideas are written down on a worksheet and passed on to the next participant. The participant reads the ideas and uses them as inspiration for more ideas. Participants are encouraged to draw on others’ ideas for inspiration, thus stimulating the creative process. After 6 rounds in 30 minutes the group has thought up a total of 108 ideas.
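
The arithmetic behind the 108 ideas, and the worksheet rotation itself, can be sketched in a few lines of Python; this is purely illustrative of the mechanics described above.

```python
# 6-3-5 Brainwriting: 6 participants x 3 ideas x 6 rounds = 108 ideas.
participants = 6
ideas_per_round = 3
rounds = 6

# Each worksheet starts with its owner and rotates one seat per round.
worksheets = [[] for _ in range(participants)]

for round_number in range(rounds):
    for seat in range(participants):
        # Worksheet currently in front of this seat (rotates each round).
        sheet = worksheets[(seat + round_number) % participants]
        sheet.extend(
            f"idea (seat {seat}, round {round_number + 1}, #{i + 1})"
            for i in range(ideas_per_round)
        )

total_ideas = sum(len(sheet) for sheet in worksheets)
print(total_ideas)  # 108
```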

Brainwriting is simple. Rather than ask participants to yell out ideas (a serial process), you ask them to write down their ideas about a particular question or problem on sheets of paper for a few minutes; then, you have each participant pass their ideas on to someone else, who reads the ideas and adds new ideas. After a few minutes, you ask the participants to pass their papers to others, and the process repeats. After 10 to 15 minutes, you collect the sheets and post them for immediate discussion.

The number of ideas generated from brainwriting often exceeds what you’d expect from face-to-face brainstorming because you’ve reduced anxiety somewhat, followed a parallel process in which a dozen people may add items simultaneously, and reduced the amount of extraneous talk that happens during brainstorming, which takes time away from idea generation.

Attribute Listing, Morphological Analysis and Matrix Analysis

Tools for Creating New Products and Services

Attribute listing was pioneered in 1931 by Robert Platt Crawford in his course on creative thinking. The technique takes an attribute or idea from one thing and applies it to another. The task of creating the ideas is more than just the process of combining things; an essential element of the process is the Attribute Listing Matrix (ALM) where the features, attributes and ideas are listed.

The Bahco Ergo Screwdriver was developed through a focus on the attributes of its handle, both in terms of safety (preventing repetitive strain injury) and the fact that at some point most people want to use a screwdriver with both hands, which meant the handle had to be redesigned.

Attribute listing is a means of getting you to focus on as many attributes of a product or problem as possible. In breaking down the elements of a problem or object, you can look at each in turn and generate new ideas. The technique is particularly useful for considering complex products or processes in that it allows you to consider each feature or stage and look at the associated attributes in detail. You can also specify the criteria by which you want to examine an attribute, for example it could be quality, cost or speed of production. You can also look at the attributes from a range of perspectives:

  • Physical attributes: shape, form, colour, texture
  • Social attributes: responsibilities, taboos, roles, power
  • Process attributes: selling, marketing, production
  • Psychological attributes: needs, motivation, emotions
  • Price attributes: cost to the customer, manufacturer, supplier
How could this basic product be changed?

Attribute Listing, Morphological Analysis and Matrix Analysis are good techniques for finding new combinations of products or services. They are sufficiently similar to be discussed together. We use Attribute Listing and Morphological Analysis to generate new products and services.

How to Use the Tools

To use the techniques, first list the attributes of the product, service or strategy you are examining. Attributes are parts, properties, qualities or design elements of the thing being looked at. For example, attributes of a pencil would be shaft material, lead material, hardness of lead, width of lead, quality, color, weight, price, and so on. A television plot would have attributes such as characters, actions, locations, and weather. For a marketing strategy you might use attributes of markets open to you, uses of the product, and skills you have available.

Draw up a table using these attributes as column headings. Write down as many variations of the attribute as possible within these columns. This might be an exercise that benefits from brainstorming. The table should now show all possible variations of each attribute.

Now select one entry from each column. Either do this randomly or select interesting combinations. By mixing one item from each column, you will create a new mixture of components. This is a new product, service or strategy.

Finally, evaluate and improve that mixture to see if you can imagine a profitable market for it.
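
If you prefer to work programmatically, a small sketch like the one below can enumerate or randomly sample combinations from such a table. The attributes and values here are invented for illustration (a simplified pencil example); substitute your own columns.

```python
# Minimal sketch: enumerate or sample combinations from a morphological box.
import itertools
import random

attributes = {
    # Column headings and a few variations (values are illustrative only).
    "shaft material": ["wood", "recycled plastic", "aluminium"],
    "lead hardness":  ["HB", "2B", "4H"],
    "grip":           ["plain", "rubberised", "triangular"],
}

columns = list(attributes.values())

# Every possible combination (one entry per column):
all_combinations = list(itertools.product(*columns))
print(len(all_combinations), "combinations in total")  # 3 * 3 * 3 = 27

# Or pick random combinations to spark discussion:
for _ in range(3):
    print(tuple(random.choice(options) for options in columns))
```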

Example

Imagine that you want to create a new lamp. The starting point for this might be to carry out a morphological analysis. Properties of a lamp might be power supply, bulb type, size, style, finish, material, shade, and so on.

You can set these out as column headings on a table, and then brainstorm variations. This table is sometimes known as a “Morphological Box” or “Zwicky Box” after the scientist Fritz Zwicky, who developed the technique in the 1960s.

  • Power Supply: Battery, Mains, Solar, Generator, Crank, Gas, Oil/Petrol, Flame
  • Bulb Type: Halogen, Bulb, Daylight, Colored
  • Size: Very Large, Large, Medium, Small, Hand held
  • Style: Modern, Antique, Roman, Art Nouveau, Industrial, Ethnic
  • Finish: Black, White, Metallic, Terracotta, Enamel, Natural, Fabric
  • Material: Metal, Ceramic, Concrete, Bone, Glass, Wood, Stone, Plastic

Interesting combinations might be:

  • Solar powered/battery, daylight bulb – possibly used in clothes shops to allow customers to see the true color of clothes.
  • Large hand cranked arc lights – used in developing countries, or far from a mains power supply.
  • A ceramic oil lamp in Roman style – used in themed restaurants, resurrecting the olive oil lamps of 2000 years ago.
  • A normal table lamp designed to be painted, wallpapered or covered in fabric so that it matches the style of a room perfectly.

Some of these might be practical, novel ideas for the lighting manufacturer. Some might not. This is where the manufacturer’s experience and market knowledge are important.

Key Points

Morphological Analysis, Matrix Analysis and Attribute Listing are useful techniques for making new combinations of products, services and strategies.

You use the tools by identifying the attributes of the product, service or strategy you are examining. Attributes might be components, assemblies, dimensions, color, weight, style, speed of service, skills available, and so on.

Use these attributes as column headings. Underneath the column headings list as many variations of that attribute as you can.

You can now use the table or “morphological box”, by randomly selecting one item from each column, or by selecting interesting combinations of items. This will give you ideas that you can examine for practicality.

Notes:

  • Attribute Listing focuses on the attributes of an object, seeing how each attribute could be improved.
  • Morphological Analysis uses the same basic technique, but is used to create a new product by mixing components in a new way.
  • Matrix Analysis focuses on businesses. It is used to generate new approaches, using attributes such as market sectors, customer needs, products, promotional methods, and so on

AIDA: Attention-Interest-Desire-Action

Inspiring Action With Your Writing


 “Free gift inside!”
“Dear Jim, You have been specially selected.”
“Calling all Parents.”

Every day we’re bombarded with headlines like these that are designed to grab our attention. In a world full of advertising and information – delivered in all sorts of media from print to websites, billboards to radio, and TV to text messages – every message has to work extremely hard to get noticed.

And it’s not just advertising messages that have to work hard; every report you write, presentation you deliver, or email you send is competing for your audience’s attention.

As the world of advertising becomes more and more competitive, advertising becomes more and more sophisticated. Yet the basic principles behind advertising copy remain – that it must attract attention and persuade someone to take action. And this idea remains true simply because human nature doesn’t really change. Sure, we become increasingly discerning, but to persuade people to do something, you still need to grab their attention, interest them in how your product or service can help them, and then persuade them to take the action you want them to take, such as buying your product or visiting your website.

The acronym AIDA is a handy tool for ensuring that your copy, or other writing, grabs attention. The acronym stands for:

  • Attention (or Attract)
  • Interest
  • Desire
  • Action.

These are the four steps you need to take your audience through if you want them to buy your product or visit your website, or indeed to take on board the messages in your report.

A slightly more sophisticated version of this is AIDCA/AIDEA, which includes an additional step of Conviction/Evidence between Desire and Action. People are so cynical about advertising messages that coherent evidence may be needed if anyone is going to act!

How to Use the Tool

Use the AIDA approach when you write a piece of text that has the ultimate objective of getting others to take action. The elements of the acronym are as follows:

1. Attention/Attract

In our media-filled world, you need to be quick and direct to grab people’s attention. Use powerful words, or a picture that will catch the reader’s eye and make them stop and read what you have to say next.

With most office workers suffering from e-mail overload, action-seeking e-mails need subject lines that will encourage recipients to open them and read the contents. For example, to encourage people to attend a company training session on giving feedback, the email headline, “How effective is YOUR feedback?” is more likely to grab attention than the purely factual one of, “This week’s seminar on feedback”.

2. Interest

This is one of the most challenging stages: You’ve got the attention of a chunk of your target audience, but can you engage with them enough so that they’ll want to spend their precious time understanding your message in more detail?

Gaining the reader’s interest is a deeper process than grabbing their attention. They will give you a little more time to do it, but you must stay focused on their needs. This means helping them to pick out the messages that are relevant to them quickly. So use bullets and subheadings, and break up the text to make your points stand out.

For more information on understanding your target audience’s interests and expectations, and the context of your message, please read about the Rhetorical Triangle.

3. Desire

The Interest and Desire parts of AIDA go hand-in-hand: As you’re building the reader’s interest, you also need to help them understand how what you’re offering can help them in a real way. The main way of doing this is by appealing to their personal needs and wants.

So, rather than simply saying “Our lunchtime seminar will teach you feedback skills”, explain to the audience what’s in it for them: “Get what you need from other people, and save time and frustration, by learning how to give them good feedback.”

Feature and Benefits (FAB)

A good way of building the reader’s desire for your offering is to link features and benefits. Hopefully, the significant features of your offering have been designed to give a specific benefit to members of your target market.

When it comes to the marketing copy, it’s important that you don’t forget those benefits at this stage. When you describe your offering, don’t just give the facts and features, and expect the audience to work out the benefits for themselves: Tell them the benefits clearly to create that interest and desire.

Example: “This laptop case is made of aluminum” describes a feature, and leaves the audience thinking “So what?” Persuade the audience by adding the benefits: “…giving a stylish look that’s kinder to your back and shoulders.”

You may want to take this further by appealing to people’s deeper drives: “…giving effortless portability and a sleek appearance that will be the envy of your friends and co-workers.”

4. Conviction

As hardened consumers, we tend to be skeptical about marketing claims. It’s no longer enough simply to say that a book is a bestseller; readers will take notice if you state (accurately, of course!) that the book has been on the New York Times Bestseller List for 10 weeks. So try to use hard data where it’s available. When you haven’t got the hard data, yet the product offering is sufficiently important, consider generating some data, for example, by commissioning a survey.

5. Action

Finally, be very clear about what action you want your readers to take; for example, “Visit http://www.mindtools.com now for more information” rather than just leaving people to work out what to do for themselves.

Key Points

AIDA is a copywriting acronym that stands for:

  • Attract or Attention
  • Interest
  • Desire
  • Action.

Using it will help you ensure that any kind of writing, whose purpose is to get the reader to do something, is as effective as possible. First it must grab the target audience’s attention, and engage their interest. Then it must build a desire for the product offering, before setting out how to take the action that the writer wants the audience to take.

Gordon Little Technique

The idea behind this problem-solving technique is to encourage you to step as far away from a particular problem as possible. Developed by William Gordon (of Arthur D Little Consulting) in the 1960s, it involves a process of progressively more detailed revelation, to avoid defining the problem too soon and limiting possible solutions. He built this approach in response to a problem he witnessed with classical brainstorming whereby people begin the process by giving what they regard as ideal or obvious solutions and then their creativity trails away.

Purpose

The purpose of the technique is to bring you out of the immediate detail of a particular problem. For example, instead of asking, “How do we get our audiences to spend another £2 each per visit,” you might ask:

  • “How do we make our audiences happy?”
  • After exploring this question in a little more detail you might ask, “How can we provide good customer service?”
  • Once answers to that question have been exhausted, you would get more specific still: “What do our audiences want from our programme/activities?”
  • Finishing with your original question, “How do we get our audiences to spend another £2 each per visit?”

It is mainly a tool for group discussion to ensure you get as wide a range of perspectives as possible, but you could try using it on your own with post-its and large sheets of paper for doodling your answers. (You would have to suspend your knowledge of the final question though!)

The Tool

This tool takes you through the technique and is ideally undertaken by a group. It is suitable for businesses of any scale or purpose. Set up a group and give yourselves enough time to work through the various layers of the problem, probably two to three hours.

The Reframing Matrix

Generating Different Perspectives

Things look different when you change perspective.


When you’re stuck on a problem, it often helps to look at it from another perspective. This can be all that you need to do to come up with a great solution.

However, it is sometimes difficult to think about what these perspectives might be.

This is when a tool like the Reframing Matrix is useful. In this article, we’ll look at how you can use it to look at problems from different perspectives.

About the Matrix

The Reframing Matrix tool was created by Michael Morgan, and published in his 1993 book, “Creating Workforce Innovation.” It helps you to look at business problems from various perspectives. Using these, you can come up with more creative solutions.

The approach relies on the fact that different people with different experiences are likely to approach problems in different ways. The technique helps you put yourself into the minds of different people, imagine the way that they would face these problems, and explore the possible solutions that they might suggest.

How to Use the Tool

The Reframing Matrix is very easy to use. All you’ll need is a pen and paper to get started.

Step 1: Draw the Grid

Start by drawing a simple four-square grid, like the one pictured in figure 1 below.

Leave a space in the middle of the grid to define your problem, and then write the problem that you want to explore in this space.

Figure 1 – Reframing Matrix Step 1


From ‘Creating Workforce Innovation’ by Michael Morgan, p.75. © 1993. First published by Allen & Unwin, New South Wales. Reproduced with permission from Allen & Unwin.

Tip:

The boxes around the grid are there for your different perspectives. If this four-box approach doesn’t suit you, feel free to change it.

Step 2: Decide on Perspectives

Now, decide on four different perspectives to use in your matrix. Two useful approaches for doing this are the 4Ps Approach and the Professions Approach.

The 4Ps Approach (not to be confused with the 4Ps of marketing) helps you look at problems from the following perspectives:

  • Product perspective: Is there something wrong with the product or service? Is it priced correctly? How well does it serve the market? Is it reliable?
  • Planning perspective: Are our business plans, marketing plans, or strategy at fault? Could we improve these?
  • Potential perspective: How would we increase sales? If we were to seriously increase our targets or our production volumes, what would happen with this problem?
  • People perspective: What are the people impacts and people implications of the problem? What do people involved with the problem think? Why are customers not using or buying the product?

(These are just some of the questions that you can ask as you look at your problem using these four perspectives.)

The Professions Approach helps you look at the problem from the viewpoints of different specialists, or stakeholders. For instance, the way a doctor looks at a problem would be different from the approach that a civil engineer or a lawyer would use. Or the way a CEO sees a problem may be different from the way an HR manager would see it.

This approach can be especially useful when you’re trying to solve a problem that involves many different types of people, or if you need to step away from your usual way of thinking so that you can be more creative.

Step 3: Brainstorm Factors

Finally, brainstorm factors related to your problem from each perspective, and add these to the appropriate quadrant of the matrix.

Once you’ve completed the matrix, you’ll have a better understanding of your problem, and you’ll be able to generate more solutions.

Tip 1:

The Perceptual Position technique can be useful when you want to see things from other people’s viewpoints.

Tip 2:

CATWOE has a similar approach. This asks you to look at a problem from the perspectives of Customers, Actors, the Transformation process, the World view, the Owner, and Environmental constraints.

Example Reframing Matrix

In the example in figure 2, below, a manager has used the 4Ps approach to explore why a new product is not selling well.

Figure 2 – Example Reframing Matrix


Key Points

The Reframing Matrix tool was originally created by Michael Morgan, and published in his book “Creating Workforce Innovation.” It helps you to look at a problem from different perspectives.

You use the tool by drawing a simple four-square grid and putting your problem or issue in the middle of the grid.

You then choose four different perspectives that you will use to look at your problem, and brainstorm factors related to your problem, starting with each of those perspectives.

Provocation 

Provocation is a lateral thinking technique. It works by disrupting established patterns of thinking, and giving us new places to start.

A key way that we think is by recognizing patterns and reacting to them. These reactions come from our past experiences, and from logical extensions of those experiences; and it’s often hard to think outside these patterns. While we may know a good answer as part of a different type of problem, the structure of our brains can make it difficult for us to access this.

Provocation is a tool that we can use to make links between these patterns. In this article, we’ll review Provocation, and discuss how you can use it to come up with creative ideas and solutions to problems.

About the Tool

The Provocation technique was developed and popularized by psychologist Edward de Bono.

You use provocation by making deliberately wrong or unreasonable statements (provocations), in which something you take for granted about the situation isn’t true.

For instance, the statements “Cars have square wheels” or “Houses have no roofs” can be provocations.

Statements need to be outrageous like this to shock your mind out of existing ways of thinking. Once you’ve made a provocative statement, you then suspend judgment and use that statement to generate ideas, giving you original starting points for brainstorming and creative thinking.

Understanding Provocation

Here’s a useful way of thinking about the technique.

Imagine you take the same route to work every day. You’re so used to it that you stop noticing the scenery, and you don’t even have to think about which route to take to get to your office.

We can use this as an analogy for our normal approach to brainstorming, where we habitually follow the same track, or steps, when we brainstorm. This limits our creativity, because any forward movement is based on the step or idea we had before.

Now, imagine that you’re leaving for work and, suddenly, you’re magically transported to an entirely new location. You’ve never been to this place before, and nothing is familiar! If this happened, you’d have to start figuring out where you were, and how you were going to take a new route to work.

This is what provocation does, and it’s why it can be so useful. Its purpose is to take you outside the routes that you normally think along, and put you in an entirely new place. Then, it’s up to you to work back to where you want to be.

When you do this, you’re addressing problems from a new perspective, and, hopefully, you’ll generate new ideas.

Using the Technique

Provocation is quite straightforward to use, although it can be challenging when you first start.

All you do is make a shocking or outrageous statement about the problem you’re trying to solve. Then, you begin to work back through several further steps.

Note:

The technique is most useful when your provocations are far-out. De Bono suggests that at least 40 percent of your provocations should be completely unusable. If you make “safe” statements, you won’t get the full value of the technique.

Step 1: Create the Provocation

It can sometimes be difficult to come up with a provocation, simply because our brains are hard-wired to come up with sensible solutions.

One way to get started with provocations is the “escape method.” Here, you make a statement that everyone takes for granted. This “take for granted” statement should be related to the problem you’re trying to solve. Once you’ve created a “take for granted” statement, you can then come up with a provocative statement to counter it.

Example:

Due to severe budget cuts, you need to come up with ways to bring in more revenue to your department for things like staff gifts, holiday parties, and little extras for the office. So, your take for granted statement would be: “We take for granted the fact that the department needs to bring in more money.”

The provocation to this assumption would be: “The department doesn’t need to earn money”.

Step 2: Create Movement/Ideas

Once you’ve made a provocation, you need to imagine what would come next. This is called the “moment-by-moment” technique: essentially, you imagine, on a moment-by-moment basis, what comes next.

Example:

Provocation: The department doesn’t need to earn money.

Moment-by-Moment: Employees are coming to work, but not to make money. Because they’re no longer trying to make a profit for the department, they decide to start working on creative pursuits during the day.

Because the employees feel so free to be creative, they begin to come up with all kinds of product ideas, artwork, and volunteer opportunities. They start to improve the department to make it a more pleasant and stimulating place. Morale and camaraderie improve since competition isn’t an issue any longer, and the hierarchy of the department breaks down since there’s no difference between entry-level workers and management.

Keep in mind that as you use the moment-by-moment technique, you don’t have to follow one line of thinking. You’ll get the greatest value from provocation if you try to come up with several alternative ideas, stemming from your initial provocation.

There are several other ways that you can create movement and ideas from your provocation. Examine:

  • The consequences of the statement.
  • What the benefits would be.
  • What special circumstances would make it a sensible solution.
  • The principles needed to support it and make it work.
  • How it would work, moment by moment.
  • What would happen if a sequence of events was changed.
  • The differences between the provocation and a sensible solution.

You can use this list as a checklist to help you brainstorm.

Step 3: Extract Value

Keep in mind that your goal is not to prove that your provocation is useful or justified. Your goal is to generate ideas that are separate from the provocation.

You extract value from the provocation by taking one of those ideas, and turning it into a viable solution to your problem.

Example:

Your initial problem was to come up with ideas that would add revenue to your department, and you came up with a few possible solutions once you used the moment-by-moment technique.

You could give employees days off from their regular work to pursue some creative ideas within the department. They might come up with some innovative products or processes that would add revenue.

Another out-of-the-box solution might be to make full use of your team’s creativity. For instance, you could encourage your team members to create artwork and donate it to a team “art sale” for the rest of the company. The profits from each sale would go into a department fund used for holiday parties.

Provocation in Groups

Provocation is also a useful technique for encouraging team creativity.

When using the provocation technique with someone else, or with a group, de Bono suggests using the word “Po.” This stands for “Provocative Operation.” The term is also a partial root of other words such as “possible”, “hypothesis”, “suppose” and “poetry” which, according to de Bono, all indicate forward movement, which is the purpose of the provocation technique.

De Bono suggests that when we make a provocative statement in public we label it as such with “Po” (for instance, “Po: the earth is flat”). “Po” acts as a signal, alerting everyone that the statement is a provocation and not one to be seriously considered. However, this does rely on all members of your audience knowing about provocation!

Note:

As with other lateral thinking techniques, provocation doesn’t always produce good or relevant ideas. However, sometimes it does, because it forces you to think in different and original ways. Ideas generated using provocation are often fresh, creative, and original.

Key Points

Provocation is a useful lateral thinking technique that can help you generate original starting points for creative thinking.

To use provocation, make a deliberately outrageous comment relating to the problem you’re thinking about. Then suspend judgment, and use the statement as the starting point for generating ideas. You can then move forward using the moment-by-moment technique, imagining how it would play out in the real world.

Finally, you extract value by picking the ideas that might be feasible, and developing them further.

Monroe’s Motivated Sequence

Perfecting the Call to Act

Be inspiring!

Is persuasion a gift? Are some people born with the ability to speak well and “sell” their ideas successfully?

It sure seems that way when you’re wowed by a motivational speaker, or galvanized into action by a thought-provoking presentation.

In your role, do you ever need to motivate, inspire, or persuade others?

Whether you’re a senior executive giving a presentation to the Board, a manager giving a morale-boosting speech to your team, or a production manager giving a presentation on safety standards, at some point, you’ll probably have to move people to action.

While there are certainly those who seem to inspire and deliver memorable speeches effortlessly, the rest of us can learn how to give effective presentations too. Key factors include putting together a strong message and delivering it in the right sequence.

Monroe’s Motivated Sequence: The Five Steps

Alan H. Monroe, a Purdue University professor, used the psychology of persuasion to develop an outline for making speeches that will deliver results. It’s now known as Monroe’s Motivated Sequence.

This is a well-used and time-proven method to organize presentations for maximum impact. You can use it for a variety of situations to create and arrange the components of any message. The steps are explained below.

Step One: Get Attention

Get the attention of your audience. Use storytelling, humor, a shocking statistic, or a rhetorical question – anything that will get the audience to sit up and take notice.

Note:

This step doesn’t replace your introduction – it’s part of your introduction. In your opening, you should also establish your credibility, state your purpose, and let the audience know what to expect.

Let’s use the example of a half-day seminar on safety in the workplace. Your attention step might be as follows.

Attention: Workplace safety is being ignored!
Shocking statistic: Despite detailed safety standards and regulations, surveys show that 7 out of 10 workers regularly ignore safe practices because of ease, comfort, and efficiency. Some of these people get hurt as a result. I wonder how comfortable they are in their hospital beds… or coffins?

Step Two: Establish the Need

Convince your audience there’s a problem. This set of statements must help the audience realize that what’s happening right now isn’t good enough – and it needs to change.

  • Use statistics to back up your statements.
  • Talk about the consequences of maintaining the status quo and not making changes.
  • Show your audience how the problem directly affects them.

Remember, you’re not at the “I have a solution” stage. Here, you want to make the audience uncomfortable and restless, and ready to do the “something” that you recommend.

Need: Apathy/lack of interest is the problem.
Examples and illustrations: Safety harnesses sit on the floor when the worker is 25 feet above ground. Ventilation masks are used more to hold spare change than to keep people safe from dangerous fumes.
Consequences: Ignoring safety rules caused 162 worker deaths in our province/state last year. I’m here to make sure that you aren’t part of next year’s statistic.

Step Three: Satisfy the Need

Introduce your solution. How will you solve the problem that your audience is ready to address? This is the main part of your presentation. It will vary significantly, depending on your purpose.

  • Discuss the facts.
  • Elaborate and give details to make sure the audience understands your position and solution.
  • Clearly state what you want the audience to do or believe.
  • Summarize your information from time to time as you speak.
  • Use examples, testimonials, and statistics to prove the effectiveness of your solution.
  • Prepare counterarguments to anticipated objections.

Satisfaction: Everyone needs to be responsible and accountable for everyone else’s safety.
Background: Habits form over time. They are passed on from worker to worker until the culture accepts looser safety standards.
Facts: Introduce more statistics on workplace accidents relevant to your organization.
Position statement: When workers are responsible and accountable for one another, safety compliance increases.
Examples: Present one or more case studies.
Counterarguments: Safer workplaces are more productive, even in the short term – so workers don’t actually gain efficiency by skipping safety rules.

Step Four: Visualize the Future

Describe what the situation will look like if the audience does nothing. The more realistic and detailed the vision, the better it will create the desire to do what you recommend. Your goal is to motivate the audience to agree with you and adopt similar behaviors, attitudes, and beliefs. Help them see what the results could be if they act the way you want them to. Make sure your vision is believable and realistic.

You can use three methods to help the audience share your vision:

  1. Positive method – Describe what the situation will look like if your ideas are adopted. Emphasize the positive aspects.
  2. Negative method – Describe what the situation will look like if your ideas are rejected. Focus on the dangers and difficulties caused by not acting.
  3. Contrast method – Develop the negative picture first, and then reveal what could happen if your ideas are accepted.

Visualization: Picture a safe and healthy workplace for everyone.
Contrast method/negative method: Continue the status quo (keep doing the same thing), and someone will be seriously injured. Picture yourself at a colleague’s funeral. You were right beside him when he decided not to wear his safety harness. How do you face his wife when you know you were right there and didn’t say anything?
Positive method: Consider the opposite. Imagine seeing your co-worker receive an award for 25 years of service. Feel the pride when you teach safety standards to new workers. Share the joy of your team’s rewards for an outstanding safety record.

Step Five: Action/Actualization

Your final job is to leave your audience with specific things they can do to solve the problem. You want them to take action now. Don’t overwhelm them with too much information or too many expectations, and be sure to give them options to increase their sense of ownership of the solution. This can be as simple as inviting them to have some refreshments as you walk around and answer questions. For very complex problems, the action step might be getting together again to review plans.

Action/Actualization: Review your safety procedures immediately.
Invitation: I’ve arranged a factory tour after lunch. Everyone is invited to join us. Your insights will really help us identify areas that need immediate attention. If you’re unable to attend this afternoon, I’ve left some pamphlets and business cards. Feel free to call me with questions, concerns, and ideas.

Key Points

For some of us, persuasive arguments and motivational speaking come naturally. The rest of us may try to avoid speeches and presentations, fearing that our message won’t be well received. Using Monroe’s Motivated Sequence, you can improve your persuasive skills and your confidence.

Get the attention of your audience, create a convincing need, define your solution, describe a detailed picture of success (or failure), and ask the audience to do something right away: It’s a straightforward formula for success that’s been used time and again. Try it for your next presentation, and you’ll no doubt be impressed with the results!
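
For those who like to draft outlines in a structured form, the sketch below arranges a message in the five-step order using Python. It is purely illustrative: the stage labels follow Monroe’s sequence, but the example content (adapted loosely from the safety seminar above) and the outline variable are our own assumptions, not part of Monroe’s method.

```python
# Minimal sketch: a speech outline arranged in Monroe's five-step order.
# The example content loosely follows the workplace-safety seminar above.
outline = [
    ("Attention", "Workplace safety is being ignored!"),
    ("Need", "7 out of 10 workers regularly skip safe practices."),
    ("Satisfaction", "Make everyone responsible for everyone else's safety."),
    ("Visualization", "Picture a workplace with an outstanding safety record."),
    ("Action", "Review your safety procedures; join the factory tour today."),
]

for step, (stage, message) in enumerate(outline, start=1):
    print(f"Step {step} - {stage}: {message}")
```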

Concept Fan

When trying to think of new ideas and solutions to problems, it is very tempting to go with your first ideas. However, first ideas are not always the best. Edward de Bono developed the ‘Concept Fan’ technique for taking a step back to get a broader perspective, and thereby a new view of the subject, of what you want to achieve, and of new ways of solving the problem.

Concept Fan:

1. To start a Concept Fan, draw a circle on a large piece of paper (A3 paper or a whiteboard), just right of centre. Write the problem you are trying to solve in it. To the right of the circle, radiate out lines representing possible solutions to the problem, as shown in Figure 1, below.

2. It may be that the first ideas generated are impractical, unremarkable, or do not really solve the problem. If this is the case, take a ‘step back’ for a broader view of the problem. Do this by drawing a circle to the left of the first circle, and write the broader definition into this new circle. Link it with an arrow to show that it comes from the first circle, as in Figure 2.
3. Use this as the starting point to radiate out new ideas as in Figure 3.
4. If this does not give you the idea you are looking for then repeat the process and take another step back as in Figure 4.

The Simplex Process

A Robust Creative Problem-Solving Process


When you’re solving business problems, it’s all too easy to skip over important steps in the problem-solving process, meaning that you can miss good solutions or, worse still, fail to identify the problem correctly in the first place.

One way to prevent this happening is by using the Simplex Process. This powerful step-by-step tool helps you identify and solve problems creatively and effectively. It guides you through each stage of the problem-solving process, from finding the problem to implementing a solution. This helps you ensure that your solutions are creative, robust and well considered.

In this article, we’ll look at each step of the Simplex Process. We’ll also review some of the tools and resources that will help at each stage.

About the Tool

The Simplex Process was created by Min Basadur, and was popularized in his book, “The Power of Innovation.”

It is suitable for problems and projects of any scale. It uses the eight stages shown in Figure 1, below:

Figure 1: The Simplex Process

Rather than seeing problem-solving as a single straight-line process, Simplex is represented as a continuous cycle.

This means that problem-solving should not stop once a solution has been implemented. Rather, completion and implementation of one cycle of improvement should lead straight into the next.

We’ll now look at each step in more detail.

1. Problem Finding

Often, finding the right problem to solve is the most difficult part of the creative process.

So, the first step in using Simplex is to start doing this. When problems exist, you have opportunities for change and improvement. This makes problem finding a valuable skill!

Problems may be obvious. If they’re not, they can often be identified using trigger questions like the ones below:

  • What would our customers want us to improve? What are they complaining about?
  • What could they be doing better if we could help them?
  • Who else could we help by using our core competences?
  • What small problems do we have which could grow into bigger ones? And where could failures arise in our business process?
  • What slows our work or makes it more difficult? What do we often fail to achieve? Where do we have bottlenecks?
  • How can we improve quality?
  • What are our competitors doing that we could do?
  • What is frustrating and irritating to our team?

These questions deal with problems that exist now. It’s also useful to try to look into the future. Think about how you expect markets and customers to change over the next few years; the problems you may experience as your organization expands; and social, political and legal changes that may affect it. (Tools such as PEST Analysis will help you to do this.) It’s also worth exploring possible problems from the perspective of the different “actors” in the situation – this is where techniques such as CATWOE can be useful.

At this stage you may not have enough information to define your problem precisely. Don’t worry about this until you reach step 3!

2. Fact Finding

The next stage is to research the problem as fully as possible. This is where you:

  • Understand fully how different people perceive the situation.
  • Analyze data to see if the problem really exists.
  • Explore the best ideas that your competitors have had.
  • Understand customers’ needs in more detail.
  • Know what has already been tried.
  • Understand fully any processes, components, services, or technologies that you may want to use.
  • Ensure that the benefits of solving the problem will be worth the effort that you’ll put into solving it.

With effective fact-finding, you can confirm your view of the situation, and ensure that all future problem-solving is based on an accurate view of reality.

3. Problem Definition

By the time you reach this stage, you should know roughly what the problem is, and you should have a good understanding of the facts relating to it.

From here you need to identify the exact problem or problems that you want to solve.

It’s important to solve a problem at the right level. If you ask questions that are too broad, then you’ll never have enough resources to answer them effectively. If you ask questions that are too narrow, you may end up fixing the symptoms of a problem, rather than the problem itself.

Min Basadur, who created the Simplex Process, suggests asking “Why?” to broaden a question, and “What’s stopping you?” to narrow it.

For example, if your problem is one of trees dying, ask “Why do I want to keep trees healthy?” This might broaden the question to “How can I maintain the quality of our environment?”

A “What’s stopping you?” question here could give the answer “I don’t know how to control the disease that is killing the tree.”

Big problems are normally made up of many smaller ones. This is the stage at which you can use a technique like Drill Down to break the problem down into its component parts. You can also use the 5 Whys Technique, Cause and Effect Analysis, and Root Cause Analysis to help get to the root of a problem.

Tip:

A common difficulty during this stage is negative thinking – you or your team might start using phrases such as “We can’t…,” “We don’t…,” or “This costs too much.” To overcome this, address objections with the phrase “How might we…?” This shifts the focus to creating a solution.

4. Idea Finding

The next stage is to generate as many problem-solving ideas as possible.

Ways of doing this range from asking other people for their opinions, through programmed creativity tools and lateral thinking techniques, to Brainstorming. You should also try to look at the problem from other perspectives. A technique like the Reframing Matrix can help with this.

Don’t evaluate or criticize ideas during this stage. Instead, just concentrate on generating ideas. Remember, impractical ideas can often trigger good ones! You can also use the Random Input technique to help you think of some new ideas.

5. Selection and Evaluation

Once you have a number of possible solutions to your problem, it’s time to select the best one.

The best solution may be obvious. If it’s not, then it’s important to think through the criteria that you’ll use to select the best idea. Our Decision Making Techniques section lays out a number of good methods for this. Particularly useful techniques include Decision Tree Analysis, Paired Comparison Analysis, and Decision Matrix Analysis.

Once you’ve selected an idea, develop it as far as possible. It’s then essential to evaluate it to see if it’s good enough to be considered worth using. Here, it’s important not to let your ego get in the way of your common sense.

If your idea doesn’t offer a big enough benefit, then either see if you can generate more ideas, or restart the whole process. (You can waste years of your life developing creative ideas that no-one wants!)

Techniques to help you to do this include:

  • Risk Analysis, which helps you explore where things could go wrong.
  • Impact Analysis, which gives you a framework for exploring the full consequences of your decision.
  • Force Field Analysis, which helps you explore the pressures for and against change.
  • Six Thinking Hats, which helps you explore your decision using a range of valid decision-making styles.
  • Use of NPVs and IRRs, which help you ensure that your project is worth running from a financial perspective.

6. Planning

Once you’ve selected an idea, and are confident that your idea is worthwhile, then it’s time to plan its implementation.

Action Plans help you manage simple projects – these lay out the who, what, when, where, why and how of delivering the work.

For larger projects, it’s worth using formal project management techniques. By using these, you’ll be able to deliver your implementation project efficiently, successfully, and within a sensible time frame.

Where your implementation has an impact on several people or groups of people, it’s also worth thinking about change management. An appreciation of this will help you ensure that people support your project, rather than opposing it or cancelling it.

7. Sell Idea

Up to this stage, you may have done all this work on your own or with a small team. Now you’ll have to sell the idea to the people who must support it. These may include your boss, investors, or other stakeholders involved with the project.

In selling the project, you’ll have to address not only its practicalities, but also things such as internal politics, hidden fears of change, and so on.

Tip:

You can learn more about how to get support for your ideas with our Bite-Sized Training Session, Sell Your Idea.

8. Action

Finally, after all the creativity and preparation comes action!

This is where all the careful work and planning pays off. Again, if you’re implementing a large-scale change or project, you might want to brush up on your change management skills to help ensure that the process is implemented smoothly.

Once the action is firmly under way, return to stage 1, Problem Finding, to continue improving your idea. You can also use the principles of Kaizen to work on continuous improvement.

Key Points

Simplex is a powerful approach to creative problem-solving. It is suitable for projects and organizations of almost any scale.

The process follows an eight-stage cycle. Upon completion of the eight stages you start it again to find and solve another problem. This helps to ensure continuous improvement.

Stages in the process are:

  • Problem finding.
  • Fact finding.
  • Problem definition.
  • Idea finding.
  • Selection and evaluation.
  • Planning.
  • Selling of the idea.
  • Action.

By moving through these stages you ensure that you solve the most significant problems with the best solutions available to you. As such, this process can help you to be intensely creative.
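
To make the cyclical nature of the process concrete, here is a minimal Python sketch. It is illustrative only: the stage names follow the list above, but the run_simplex function and the idea of printing the stages are our own assumptions, not something taken from Basadur’s book.

```python
# A minimal sketch of the Simplex cycle: eight stages repeated continuously.
SIMPLEX_STAGES = [
    "Problem finding",
    "Fact finding",
    "Problem definition",
    "Idea finding",
    "Selection and evaluation",
    "Planning",
    "Selling of the idea",
    "Action",
]

def run_simplex(cycles=2):
    """Walk the eight stages in order; finishing one cycle leads into the next."""
    for cycle in range(1, cycles + 1):
        print(f"Improvement cycle {cycle}:")
        for step, stage in enumerate(SIMPLEX_STAGES, start=1):
            print(f"  {step}. {stage}")
    print("...and the cycle continues with the next problem.")

run_simplex()
```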

Innovation Management for the Future

“History reminds us that at every moment of economic upheaval and transformation, this nation has responded with bold action and big ideas.” As President Barack Obama addressed a joint session of Congress on Tuesday, Feb. 24, he took a moment to look back, pointing to the innovations that have arisen from times of difficulty: the railroad tracks, laid across the country in the midst of the Civil War; the public high school system that emerged from the Industrial Revolution; the GI Bill that sent a generation to college. Obama’s theme was clear: Times of economic difficulty can inspire extraordinary innovation. And now, even as the markets continue their roller-coaster ride, he described a time “to put in place tough, new common-sense rules of the road so that our financial market rewards drive and innovation and punishes shortcuts and abuse.”

We talked with several trend-watchers and futurists about the kinds of innovations they expect to come out of this recession. Along with Obama, they focused on themes of energy and health care, with technology and computing rounding out their wish lists. All saw the opportunity to reframe problems to come up with radically new solutions. “There’s a reason why they call them market corrections,” says author and futurist David Zach. “Things that don’t work, are inefficient, out of date, or bloated often need to be bypassed.” He sees this scenario developing in realms such as the Web, with access to high-speed Internet overcoming geographical barriers to allow ever greater marketplace participation.

On the energy front, the big advances will be in biofuels and renewable sources. Giant turbines will harness the power of ocean currents. Biofuels that won’t drive up global food prices are being made. Technology will repurpose the energy from the human body to recharge our cell phones and music players. Super-charged batteries that hold more juice and require fewer charges will power electric cars and laptops.

Innovations in Health Care

In health care, self-diagnostic technologies that can be used at home will replace costly doctor visits. Heavy, unwieldy medical equipment that until now has been laboriously wheeled around hospital floors is being transformed into portable machinery that can be used at home or in a remote village. New nanotech and biotech drugs will cure devastating diseases. And the health-care system itself will be overhauled, with digitization of patient records cutting costs and increasing transparency and reliability of care. Glen Hiemstra, author and founder of Futurist.com, wants to see universal coverage, while allowing folks to purchase insurance privately. “Health care is at the center of almost all business-labor issues. Moving away from employer-provided health care will free us like almost nothing else I can think of,” he says.

Given the subprime debacle, the ensuing economic meltdown, and the current ups and downs of the markets, it’s perhaps not surprising that many of the experts are predicting drastic changes in the world’s financial systems. “I think we’re going to see the creation of new instruments and tools to value, trade, and share wealth and currency,” says trend-spotter and journalist Josh Spear.

Of course, longed-for innovations don’t always make it to the market. Radically new ideas for transportation were on most of the futurists’ wish lists, but the chances of a high-speed cross-country train within the U.S. still seem slim. But, as vehicle sharing and trackable, more reliable, and eco-powered buses gain popularity, chances are that better urban transit will become a reality.

Looking into the future is uncertain, messy work, but if businesses view the current economic crisis as an opportunity to innovate, we may still be marveling at these breakthroughs decades from now.

Open Innovation

The emergence of Open Innovation means, among other things, that innovation management will become more collaborative and that business model innovation will become as important as technological innovation. This author, who coined the term Open Innovation and literally wrote the book on it, has excellent advice for readers.

Those who study innovation often can be overwhelmed by the variety and speed at which clever new products and services come into the market.  But it is helpful to take a step back from these myriad innovations to reflect on the evolution not only of the technologies themselves but also the processes used to create, develop and manage them.  While the latest technologies of the recent past (think of Facebook, Twitter, Android, iPod/iTunes/iPhone/iPad, just to name a few) get most of our attention, it is often the processes that led to the creation of these technologies that may prove more enduring.  These also are being innovated.

Let’s look back in time at previous management innovations to gain some perspective.  In the 1960s, a significant management innovation that fostered the development of many technologies of that era was the concept of Systems Analysis that Robert McNamara and his colleagues brought into the U.S. government, particularly in the defense sector.  This was a comprehensive way to assign priorities and allocate resources to competing projects, based on their total costs and benefits.

In the 1970s, the success of the Apollo missions to the moon spurred a management innovation in project management. Program Evaluation and Review Technique (PERT) charts were developed to map sequences and dependencies in complex projects, so that the critical path that determined the date at which a project could be completed could be identified. PERT charts also helped companies assess trade-offs when activities on the critical path began to fall behind schedule.

The 1980s saw the rise of Total Quality Management in the U.S., as the principles of Deming and Juran, which had been so influential in Japan in the 1970s, finally found acceptance in their home country.  Related processes like Six Sigma became widespread, as the U.S. struggled to compete with higher quality products from Japan in many technology-based industries, as well as autos.

The 1990s witnessed the spread of supply chain management, as companies invested in sophisticated software and other tools to link themselves more and more closely with their key suppliers.  Data were shared extensively with suppliers, as companies streamlined inventories and took more costs out of their supply chains.  The rise of the Internet allowed companies to link their supply chains closely to customer demand, as companies like Dell and Amazon took orders first, then organized the fulfillment of the customer’s order through their network of suppliers.  This greatly facilitated the globalization of these supply chains.

What about the most recent decade just ended?  While there hasn’t been much time yet to assess the many possible management innovations over the past decade, one plausible suggestion is that this past decade has been the decade of Open Innovation.  In this decade, companies started to open up their research and development processes, involving customers, suppliers, universities, third parties and individuals in the innovation process.  The old “Not Invented Here” syndrome that restricted the use of external ideas has been largely rejected now in most industries.  We also see more companies allowing their unused internal ideas and technologies to go outside for others to use in their business.  With Open Innovation, companies are innovating more with fewer internal resources, saving time, reducing risk, and identifying new markets.

One set of facts to support the rise of Open Innovation in this past decade comes from a Google search of the term “open innovation.” Chesbrough did such a search back in 2003, when he first published the book Open Innovation. At that time, Google returned about 200 page links, mostly of the variety “company X opened its innovation center at location Y.” In the summer of 2010, that same search yielded 13 million links, with the vast majority reflecting a new model of industrial innovation.

MANAGEMENT INNOVATIONS IN THE NEAR FUTURE

Armed with this perspective, where might management innovation go from here? Chesbrough offers three short predictions:

First, innovation management will become more collaborative.  Opening up the innovation process will not stop with the accessing of external ideas and the sharing of internal ideas.  Rather, it will evolve into a more iterative, interactive process across the boundaries of companies, as communities of interested participants work together to create new innovations.  Organizations like Syndicom, for example, have already established a community of spinal surgeons who meet up virtually to share effective protocols for screening patients for new therapies, and new methods and techniques to achieve better patient outcomes for those new therapies.  Companies will increasingly compete on the breadth, depth, and quality of their communities that surround their activities.  New technologies like agile software development will help companies interact more intensively – and more productively – with current and potential customers, elevating them to full partners in the innovation process.

Second, business model innovation will become as important as technological innovation.  The business model is the predominant way a business creates value for its customers and captures some piece of that value for itself.   It is generally accepted that a better business model can often beat a better technology.  Yet companies that spend many millions of dollars on R&D seldom invest much money or time in exploring alternative business models to commercialize those discoveries.  Not all business models are created equal, and we will learn how to design and improve business models in the coming decade.

There is an irony here.  Companies that spend millions of dollars on developing new ideas and technologies often lack any process for exploring alternative business models to commercialize those new ideas and technologies.  This is a situation that cannot stand indefinitely.  Through devices like the business-model canvas of Alex Osterwalder, organizations are learning techniques to visualize both their current business model as well as possible alternative models.

A further imperative driving business-model innovation is the rebalancing of the global economy, with the bulk of economic growth over the next few years coming from the so-called emerging economies.  Companies wishing to expand their business to the rising emerging economies will find that the business models that succeeded in the already-developed economies will not succeed in these new markets.  In turn, the rise of multinational companies from the BRIC economies that successfully enter the advanced economy markets in the West with new and different business models will further advance this trend.  Companies will need to learn how to manage multiple, sometimes even conflicting, business models at the same time in different parts of the world.

Third, we will need to master the art and science of innovating in services-led economies.  Most of what we know about managing innovation comes from the study of products and technologies.  Yet the world’s top 40 advanced economies today derive most of their GDP from services rather than products or agriculture.  In order to preserve prosperity and high-wage employment in the advanced economies, we will have to learn how innovation works in services, which is likely to differ from how it works in products.  If we incorporate the above two predictions as well, one can predict that the winning formula for managing innovation in the next decade will be an open-services innovation approach.

It is worth expanding upon this last prediction, which is the subject of Chesbrough’s latest book, Open Services Innovation. The first step toward successful services innovation is recognizing that the customer is at the heart of service innovation. A service is an intangible – something that has value that you can’t drop on your foot. The value is the experience of the customer who receives it. Since experience is subjective by its very nature, two people may perceive the same service quite differently. Therefore, innovating in services requires different tactics than innovating products. For example, inviting the customer to participate as a co-innovator is a powerful method for business leaders to harness this subjectivity and differentiate their companies from competitors, all the while creating more value for customers. Lego reached an entirely new market of teachers when it allowed its customers to modify its Mindstorms software to manipulate its robotics for kids, and the teachers realized they could use the service to construct a curriculum to teach robotics to middle school students.

Business leaders also need to realize that since service businesses often are people-intensive, growing one profitably will require focusing on core strengths on the one hand, while providing a wide variety of choice to customers on the other.  Focus and variety are often at odds with one another.  The only way to do both profitably is to open up the business, turning it into a platform for others to work alongside or build on top of.  Opening up the business to others allows companies to provide one-stop shopping to customers, while leveraging their core activities that comprise the structure of the platform.  Amazon allows merchants to use its internal tools to build web pages on Amazon to offer merchandise to Amazon customers, who cannot tell whether the item they purchase is from Amazon (such as books) or somewhere else (such as jewelry).  So Amazon focuses on its core strengths in Internet retailing, and provides a structure for many third parties to sell a wide variety of merchandise to Amazon customers, without taking on the inventory and merchandising risks of that expanded set of products.

Finally, focusing on service innovation, making customers central to the process, and opening up to other companies require embracing a good deal of internal change for most companies.  This means that opening service innovation will change your business model.  Open service innovation will require companies to charge customers in new ways, use different mechanisms for payment, and perhaps find additional revenue streams to support the business.  Opening up to outsiders will often require sharing financial risks and rewards with them.  Traditional competitors may become customers or partners in the new business model, and there may be multiple and sometimes conflicting distribution channels for one’s offerings to reach the market.

Companies that undertake services innovation are learning to tackle these challenges.  IBM’s Global Services business supports competitors’ products at its customer locations, and shares technical information with competitors who support its products as well.  Its services business now accounts for well over 50 percent of its revenues, and is also growing IBM’s profits.  Xerox now offers to manage all of its customers’ copiers and printers, regardless of who made them.  Its services business is also growing rapidly, accounting for more than 25 percent of its sales.  The company recently acquired Affiliated Computer Services for $6.4 billion, which will further expand its services activities.

Innovation is constantly changing, as is the process by which new ideas and technologies get to market. Companies that rest on their laurels may do well for the moment, but it is safe to bet that the innovation process is changing, whether the company realizes it or not. The best approach is to embrace the idea that innovation will continue to change, and that organizations that seek to profit from innovation must take on the challenge of changing with it.

Cassandra has every reason to fear innovation. A particularly nifty piece of Greek engineering once brought doom to her family in the shape of a horse. By contrast, Henry Chesbrough, faculty director of the Garwood Centre for Corporate Innovation at Berkeley’s Haas School of Business, embraces new ideas.

His three predictions for 2014 concern areas from research and education to venture capital and Asia’s service industry, as seen below.

1. Universities will be increasingly disrupted by both new technologies and society’s demands. The advent of MOOCs (massive open online courses) will continue to challenge the fundamental architecture of the university, which bundles teaching and research into a single organisational entity. Because world-class instructors are available to anyone via the internet, pressure on the teaching portion of the university will not ease in 2014. Meanwhile, the research mission of the university will compete with the work of other providers. For example, the European Union’s Horizon 2020 funds (for research and innovation programmes) will shift from supporting basic scientific inquiry to more applied endeavours that seek to make a commercial or industrial impact on society. While it will take years, a major rethink of the role and structure of the university is in order.

2. Corporations will increase their presence in the venture capital world. Traditionally viewed as “dumb money” by seasoned venture capitalists, corporate venture capital (CVC) is making a surprising, sustained comeback. In certain sectors, such as renewable energy or the life sciences, CVC accounts for nearly half of the venture money being invested. This is due in part to CVC’s ability to wait patiently for startup ventures to build their businesses, and also partially because corporations are often the most likely exit for most of these ventures. Even the most skeptical private venture capital firms are seeking to partner with CVCs as a result.

3. Services innovation is coming to Asia. Whether it is Japan, looking to translate its technological prowess into new growth, China, looking to increase domestic consumption, or Korea, looking to escape the commodity trap, many leading Asian companies are starting to invest time and money in enhancing their service offerings. Even traditional manufacturing companies are finding that services provide a welcome nudge to profits, and increase customers’ satisfaction with their products as well. However, the business culture in leading Asian economies is very engineering-focused, and it will be a challenge to promote executives with deep service experience to the top levels of leading firms.

Innovation Management Concepts

CUSTOMER-BASED INNOVATION

Customer-based innovation is about finding new and more profound ways to engage with customers and develop deeper relationships with them. This is the most important concept of all in terms of investment priority for the coming years. Customer-based innovation is driven strongly by the convergence of three key trends:

Total customer experience: Driven by a desire to build a deeper relationship with the customer, what used to be a business model for B2B businesses with only a limited customer base is quickly developing within other spheres. Japanese and German vehicle manufacturers (for example Lexus, Infiniti and BMW) continue to explore ways of designing an “ownership experience” rather than just a car, designing service and support at all touch points with the same care as they design the cars. Such skills will serve them well as we move towards electric cars and need to manage customer acceptance issues around batteries and their replacement.

Design-in emotional aspects: The second trend emerging in this space is the realization that, as technology allows manufacturers to deliver as much and often more functionality than the typical consumer can use, the bases of competition will change. Rather than compete on yet more features and functions we will see manufacturers compete even more on style, design and emotional connection, with approaches used in the luxury and fashion markets being increasingly adopted in more traditional sectors.

Social networking: The third converging trend is closely linked – the use of social networks to underpin companies’ propositions and relationships with their customers. Increasingly this will span B2B as well as B2C: for example, in recent work for a financial institution we explored how they might develop tools to allow banks to build relationships and communities within and between the finance directors of large corporates.

PROACTIVE BUSINESS MODEL INNOVATION

A business model defines how to create and capture value within a value chain, considering both operations and strategy. Business model innovation as a concept is certainly nothing new, but there is still much to be done to develop a convincing innovation management approach that is sufficiently systematic and repeatable to generate new, innovative business models.

We expect to see three key trends in successful business model innovation in the future.

Deliver “thick value”: First and foremost, consumers and stakeholders will require companies to focus more on the creation of “thick value”. Today, business still often focuses on the creation of “thin value”, i.e. purely profit-driven transactions between the organization and its stakeholders, as opposed to “thick value”, which considers more lasting stakeholder value, for example increasing the resilience of stakeholders in the face of global societal and economic pressures such as climate change, demographics or energy security. As part of the business model innovation process, organizations will need to identify new types of thick value – purpose-driven stakeholder transactions – to fill unarticulated needs both meaningfully and profitably.

A good example is the “closed loop” approach taken by some chemical and cleantech companies:
AkzoNobel takes back “used chlorine” from its customers; Umicore helps mobile phone and car manufacturers to include the recycling of products into the overall value proposition to B2B and B2C customers. Often this leads to new business concepts, for example leasing instead of purchase.

Use modular approaches to cope with complexity: The need to be global and act local greatly increases the complexity of managing the business. We expect that companies will increasingly need to take a modular approach to business models – innovating such that different modules can be used as building blocks in a range of market environments, each supporting the overall strategy of the company. One simple example of this is Unilever, which employs “Unilever Ladies” to distribute Unilever products to small villages.

Adapt business models to new markets: Dealing with globalization requires a more significant effort than just “copy-pasting” the existing business model into a new market. Exporting an existing business model to a new market may not be successful. There is an important need for companies to find better ways to generate innovative business models proactively to meet the needs of new markets, or to respond to new developing-world competitors, such as the developing “middle segment” of China and India.

FRUGAL INNOVATION / REVERSE INNOVATION

Frugal Innovation, sometimes referred to almost interchangeably as “Reverse Innovation”, is all about originating and developing innovations in lower-income, emerging markets, taking the needs of poor consumers as a starting point, then transferring, adapting, applying and distributing them in developed markets. This is the opposite of the traditional innovation approach, which has been to develop innovations in the higher-value “knowledge economies” of the developed world, to use the emerging markets as a low-cost manufacturing resource, and sometimes to strip the product or service of unnecessary cost and functionality to enable it to compete in the emerging markets.

A good example of frugal product innovation is the hand-held electrocardiogram (ECG) machine that was invented in GE’s Bangalore laboratory. It’s portable, light, battery- or mains-operated, reliable, and cheap (about 40% of the cost of a conventional ECG machine). ECG test costs have come down to a level (about $1 per ECG) that many people in industrializing countries can afford. Interestingly, after India and China, the product has now also been launched in the US.

Frugal innovation brings about a rethinking of the nature of innovation. Instead of “more” it often strives for “less”, using clever technology to create masterpieces of simplification in mobile phones, computers, cars and financial services. Frugal innovation is clearly not just about innovating products; often, changes in the whole supply chain are involved.

Frugal Innovation has major implications for companies:

  • Innovation systems rapidly have to be implemented globally – you have to be where the shifting action is.
  • “Frugality” has to become a facet of the innovation mindset of every company (Philips’ “Sense and Simplicity” concept is an interesting example).
  • More flexible and open-minded innovation approaches are needed as the “affordability” orientation becomes more important.

HIGH SPEED / LOW RISK INNOVATION

The drive to reduce time to market and selectively increase the speed of product cycles shows no sign of slowing over the next 10 years. One aspect that is set to become increasingly critical is the importance of getting to market not just fast, but also accurately and with no flaws. Due to the rise in global brands and the arrival of vivid, uncontrolled, ubiquitous mass communication, there is the potential for immense destruction of shareholder value from any flaw in product or service. We therefore expect to see further development of approaches and tools to drive fast, de-risked product and service innovation.
Here are some examples:
Trial and experiment: We expect to see ever-increasing use of the trial and the experiment, starting already in the functional specification phase. At the “fuzzy front end” this will be through increasing use of virtual prototyping and immersive 3-D visualization software to develop both products and services. Lead customers will become ever more involved. Simultaneously we will see “open innovation” become more sophisticated as lead customers become accepted as part of service and product delivery. Google and Microsoft are but two examples of organisations that have already embraced this.

Global 24/7 product/service development: Simultaneously we will see the maturation of a trend towards truly global innovation management teams. This will be supported by the continuing development of product design, management and prototyping tools. Global teams with virtual organisations will allow 24/7 development in pursuit of speed. More importantly, they will allow a wider range of cultures and perspectives to be brought to bear in product creation. This will be vital as global platform products are customized for local success, marking the shift of the locus of power from developed economies to the emerging economies.

Gradual product rollouts: We expect to see less dramatic big launches and more of a continuing roll-out when new products and services are released to their markets. Microsoft’s gradual launch of Office 2010, which progressed through beta trials and early versions that could later be upgraded to full versions, was an example of this in practice. The approach reduces risk, both for the manufacturer and the user, and will become crucial as systems become ever more complex and interrelated.

INTEGRATED INNOVATION

Integrated Innovation is all about taking innovation approaches that were once the domain of New Product Development (NPD) only – such as idea management, stage gates and portfolio optimization – and applying them consistently as an integral part of business strategy to achieve not only growth but also competitiveness.

We expect this to be a key focus over the next decade: our survey revealed that the proportion of CTO/CIOs who rate “integration of innovation into business strategy” and “seamless cross-functional innovation processes” High or Very High went up from some 30% over the last decade to around 90% for the coming decade, one of the highest increases in the survey.
There are several factors driving this integration. First, companies are increasingly adopting team-based approaches to combine resources across traditional functional divisions such as Marketing, R&D and Manufacturing. This enables them to respond better to the ongoing blurring of product and service, ever closer customer involvement and the need for ever faster responsiveness.
Second, we expect that businesses will increasingly need to look towards more radical innovation in order to stay ahead of the pack: for example, in our survey, CTOs expected the proportion of innovative new products in adjacent and new business areas to be nearly 3x as big as it was in the last decade. Such an increase would have fundamental consequences for the nature of innovation management, for example in the way that companies organize themselves to manage and assimilate such a rapidly growing portfolio of new products, services and businesses, often in untried markets and exposed to much greater risk.
Third, there is great scope for improvement in the application of formal innovation management approaches outside the realm of NPD. Business leaders are getting better at understanding innovation tools and techniques. Innovation, like other disciplines, is going through a maturity cycle. Approaches that were the realm of the specialist 10 years ago, such as idea management or strategic portfolio management, have become mainstream. The new challenges lie in how to apply these approaches effectively across the rest of the business.
In summary, we see the following aspects of Integrated Innovation as being important for the future:
Innovation integral to business strategy: Many companies already claim innovation as being integral to business strategy, but struggle to explain exactly how this happens – more post-event justification than reality. As innovation tools, including especially radical innovation tools, become more embedded throughout the organization, we expect that leading companies will become much better at applying them more purposefully and effectively in a corporate strategy context.
Systematic non-NPD innovation: This means greater and more consistent application of formal innovation tools and approaches to improve the effectiveness of proactive innovation in non-NPD areas such as management processes, manufacturing operations, business models, supply chain and sustainability. This will also include greater application of innovation management tools for cost reduction and competitiveness improvement: for example in our survey, in the next 10 years CTO/CIOs expected innovation to yield nearly double the equivalent reduction in unit costs achieved in the last 10 years.
Embedded innovation process ownership: We expect to see ownership of the innovation process shifting increasingly outside the Technology and R&D functions, ultimately becoming fully embedded in other business functions. We expect to see innovation performance being measured more explicitly across these functions, somewhat analogous to the way Quality management has evolved.
Radical/disruptive innovation: There will be a need for increasing proficiency and effectiveness in applying techniques focused especially on radical innovation and new growth opportunities in adjacent or completely new business areas. This will entail integrating innovation disciplines even further into business strategy.
The next decade is set to be even tougher than the last in terms of the need for innovation, something the vast majority of global executives recognise. To stay competitive, companies are going to have to up their game, especially in innovation in adjacent and new business areas, and in managing the complexity of truly global, decentralized innovation resources:
New technology-based business development and venturing: This will take an ever-increasing proportion of companies' efforts as they strive to grow and maintain competitiveness by building products and services in adjacent and new business areas.
Innovation process management: Companies will need to find new ways to manage their innovation process. The new processes will need to connect much more intimately with customers, to enable application of innovation holistically across the whole of the business, to increase speed to market, to enable development of new business models and to encourage new dimensions such as frugal innovation across a global innovation network
Knowledge management: Complexity, integration, speed and globalization all mean that excellence in knowledge management, including sources external to the company, is going to be more crucial than ever in the next decade.
Orchestrating decentralized competence centres: Companies' investments in innovation are becoming more and more global, particularly for companies in developed countries. Asia has seen by far the largest inflow of R&D investment, rising from 13% of total world R&D expenditure in 2002 to 19% in 2007, according to the UNESCO Institute for Statistics.

Next Generation of Green IT

Green IT triangle services

A decade ago, green computing was a concept mostly relegated to a handful of granola-crunching environmentalists who just happened to have the title of CEO slapped in front of their name. Then came the Great Recession, a radical jump in the cost of energy and a younger generation that embraces sustainability. Suddenly, green IT represented a bona fide way to slash utility bills.

Today, green computing has gone mainstream in the enterprise. Computer manufacturers design servers and storage devices for greater energy efficiency, virtualization and cloud computing are viewed as tools to reduce data-center energy costs, and CIOs increasingly focus on ways to architect networks and data centers for maximum efficiency.

However, there’s a fundamental problem: As organizations become more reliant on the Internet, networked communication and cloud computing for mission-critical tasks, there’s near-zero tolerance to sacrifice performance for energy savings. Straddling the line between these two distinct worlds is daunting, and the growing complexity of IT is sometimes overwhelming. No CIO wants to take a hit—and possibly forfeit customers—because of energy efficient but subpar systems.

The upshot? Some executives are taking a close look at workload management software and other tools that optimize IT environments. A handful of vendors, such as Adaptive Computing and VMTurbo, have taken aim at high performance computing and cloud environments that are at the center of this next generation of green IT. These systems consolidate workloads, power down components when they aren’t required and offer processors optimized for energy efficiency, such as Intel’s Xeon chip.

In fact, “performance” and “energy efficiency” are no longer contradictory goals. At the University of Tennessee, Knoxville, the Beacon supercomputer is one of the fastest and most powerful systems in the world, but it is also the most energy efficient, operating at a ratio of 2.5 gigaflops per watt. Meanwhile, the fastest supercomputer in the world, Oak Ridge National Laboratory’s Titan (which uses both GPUs and traditional CPUs to achieve a rating above 10 petaflops), is about 10 times more powerful than its predecessor but draws only slightly more power.
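
As a rough illustration of what figures like these imply, the sketch below (Python, with purely illustrative inputs) converts a performance-per-watt rating into the power a system would draw at a given sustained performance; only the 2.5 gigaflops-per-watt figure echoes the article, and the outputs are not taken from it.

    # A rough sketch relating performance-per-watt to implied power draw.
    # Inputs are illustrative; only the 2.5 gigaflops-per-watt figure echoes the article.
    def implied_power_kw(performance_gflops: float, efficiency_gflops_per_watt: float) -> float:
        """Power draw in kilowatts = performance divided by efficiency, converted from watts."""
        return performance_gflops / efficiency_gflops_per_watt / 1_000

    # At 2.5 gigaflops per watt, sustaining one petaflop (1,000,000 gigaflops)
    # would take roughly 400 kW.
    print(f"{implied_power_kw(1_000_000, 2.5):.0f} kW per sustained petaflop")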

Sustainability and green IT are not getting any easier. Many organizations have already snagged the low-hanging kilowatts. However, for savvy CIOs, the next generation of savings is both a challenge and an opportunity. The bottom line is that there’s still green in green IT.

In a “Jetsons”-like future, refrigerators will know when we’re low on items such as cheese and beer and send a message to our GPS-equipped cell phones to remind us to pick up a wedge and a six-pack the next time we walk into our favorite grocery store — and thus prevent an extra 20-mile jaunt in our 2,000-pound car for a few items. Such a future is just around the corner, Kammen says.

“Smart hardware won’t solve our consumption addiction, but it will allow us to be much more efficient,” he says. “And movement of goods around is a big deal.”

Kammen and his colleagues are currently matching up energy and information technologies with a smart phone application that lets people take a virtual test drive of an electric vehicle such as the Nissan Leaf or Chevrolet Volt. The app sits on a GPS-equipped smart phone and rides along with drivers in their current car. Then, the users can go online, upload their data, and learn what their energy consumption would have been if they were driving an electric ride.

The energy costs of information technology (IT) are becoming increasingly visible for enterprises in a variety of industries.  In the past, energy consumption was not seen as a priority issue in the data center and the split of responsibilities between IT and facilities organizations within the enterprise made it easy to skirt the issue of energy efficiency.  However, according to a recent report from Pike Research, this dynamic is changing quickly, and the cleantech market intelligence firm forecasts that by 2015, global investment in energy efficient data center technologies will represent 28 percent of the $150 billion data center infrastructure market.

“The green data center has evolved in response to concern over energy use, but it is also connected to the broader transformation that data centers are undergoing,” says senior analyst Eric Woods.  “Data centers of the future will be more energy efficient, more adaptable to new business needs and new technology opportunities, and virtualized to ensure optimal use of IT resources, space, and energy.”

As part of its analysis, Pike Research has identified seven key trends that are shaping the future of the green data center:

  • IT managers are recognizing the energy and environmental costs of the continuing expansion of computing power, and are actively looking for ways to counteract them.
  • Over the next five years, data centers will move toward a totally virtualized environment that can provide computer services from both public and private cloud models.
  • The life cycles of power and cooling infrastructure will become more aligned with the IT assets they support.
  • More dynamic data centers will require more sophisticated management tools and a holistic view of the entire ecosystem.
  • The relationship between the data center and the business it serves is changing. If the data center is to be part of a broader sustainability program, then its true cost must be more visible to the business.
  • Power usage effectiveness (PUE) ratings are a first step for new data center metrics, but PUE hides as much as it discloses, and more work will be needed to define an acceptable measure for the productivity of the data center (a minimal PUE calculation is sketched after this list).
  • Modularization in data center design will be combined with more flexible approaches to provisioning. This is part of a broader shift to an industrialized view of the data center.
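
For readers unfamiliar with the metric, the sketch below shows the standard PUE calculation in Python. The energy figures are hypothetical and only illustrate why a ratio of total facility energy to IT energy says nothing about how productively the IT load itself is used.

    # A minimal sketch of the standard PUE calculation; input figures are hypothetical.
    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power Usage Effectiveness: total facility energy divided by IT equipment energy."""
        return total_facility_kwh / it_equipment_kwh

    # Example: a facility drawing 1,500 MWh a year overall while its IT gear consumes
    # 1,000 MWh has a PUE of 1.5 -- half a unit of overhead (cooling, power conversion,
    # lighting) for every unit of IT load. The metric is silent on whether that IT load
    # is doing useful work, which is why PUE "hides as much as it discloses".
    print(pue(1_500_000, 1_000_000))   # -> 1.5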

For a Greener Data Center

IT departments today already have to struggle with a wide array of influences, pressures and goals.  Now they have yet another thing to think about: The environment and how their technology decisions may impact our planet.  But according to Logicalis, an international provider of integrated information and communications technology (ICT) solutions and services, green doesn’t have to be hard.

One place to start when looking toward a “greener” future for your organization is inside the data center itself.  When cooling accounts for as much as 40 percent of the cost of powering a typical midsize data center (around 2,500 square feet), evaluating the data center’s airflow can be one of the best green decisions you’ve ever made – both in terms of the environment and the dollars you can save.

“Airflow is one of the easiest ways to impact your company’s bottom line by reducing power consumption,” says Bob Mobach, practice director, data center infrastructure, at Logicalis.  “By making a few minor adjustments, you can save money, significantly reduce your electricity requirements, and create a better functioning, more comfortable data center environment as a whole.”

Why Green IT

Because the world’s appetite for energy is outpacing production of renewable and non-renewable resources. Because the world is too densely populated to escape the effects of greenhouse gas emissions, electronic waste disposal and toxic production methods. Because ICT is both part of the problem and a key to the solution. Because thriving requires combining social responsibility, smart resource use and technological innovation.

Global carbon emissions attributable to ICT have been estimated at 2% to 2.5% of world totals – about the same as the airline industry – and as high as 5-6% of developed nation totals. McKinsey forecasts that the ICT sector’s carbon footprint will triple during the period from 2002 to 2020. For office buildings, ICT typically accounts for more than 20% of the energy used, and in some offices up to 70%. Although energy costs typically comprise less than 10% of an overall IT budget, in a few years they could rise to more than 50% according to a 2006 Gartner report. Many large organizations – such as Google – already claim that their annual energy costs exceed their server costs.

Radical improvements in waste reduction and energy use rely on innovative applications of information technology:
  • Telework can reduce not only automobile travel but overall energy use by reducing the amount of dedicated office space.
  • Smart energy applications adjust energy consumption to real-time need patterns and climate conditions, resulting in drastic reductions in waste.
  • Virtualization can eliminate wasteful network equipment, reducing energy and floor space.

In November, Citi opened a state-of-the-art data center in Texas. The new 305,000-square-foot facility features computer servers that are “virtualized” so each can do the work of 10 older models, software that alerts operators if systems aren’t running efficiently, and pollution controls on emissions and water usage. The new center is part of an effort to cut in half the 52 data centers Citi operates worldwide. Those centers, often kept as cold as an icebox so computers will run optimally, account for 24 percent of the company’s power usage.

Working with HP, Citi has saved over $1 million so far on power and cooling expenses by consolidating and “virtualizing” roughly 15 percent of its 42,740 servers worldwide. “We did an initial analysis and it pretty much paid for itself when we looked at what we were paying for some of the older data centers and what capacity we’d need,” says Jack Glass, Citi’s senior vice president for data-center planning.

Those efforts show where the future of I.T. lies: more energy-efficient hardware, software that helps manage power usage, zero-emissions facilities, and lower thermostats in data centers.

While the global economy has tanked, spending for green I.T. is soaring. Forrester Research, a technology market research firm, expects the $500 million spent on green I.T. services in 2008 will grow to $4.8 billion by 2013. While the economy has made it harder to go green simply for public relations value, executives are finding other reasons. Tech cycles are short and, as older hardware needs to be replaced, firms are consolidating and upgrading to greener models. Some global executives are also hedging their bets as they await U.S. regulations to cap carbon emissions; a new emission trading law goes into effect next year in the U.K.

But financial payback is even more compelling. The potential for quick savings from green I.T. has caught the attention of corporate giants and small businesses. Technology pioneers Microsoft and Google are building green data centers near cheap hydroelectric power sources in the Pacific Northwest. The world’s fifth-largest commercial airline, Continental, has saved more than $2 million through server virtualization. Highmark, a health insurer in Harrisburg, Pennsylvania, with 4.5 million members, cut its electric bill by 10 percent last year, and halved its 400 servers, by building a more efficient data center with help from IBM. “That only amounted to about $52,000,” says Mark Wood, Highmark’s director of data-center infrastructure. “But in 2010, we’re expected to see rate increases of 20 to 40 percent from our utility.”

“Spending a little more for energy-efficient servers typically pays off pretty fast when you look at energy costs over three or four years,” says Christopher Mines, a Forrester analyst. “Whereas the hybrid car I bought is going to take 12 years to pay me back for that price premium.”

The recession has made it more difficult for companies to invest in multimillion-dollar data centers, but there are alternatives that make the economics work more quickly. In data centers, server virtualization, changing the layout of devices, and using more heat-resistant hardware can cut the number of servers needed and reduce cooling costs. “These engagements are typically $20,000 to $50,000,” says Steve Sams, IBM’s resident green I.T. expert. “The average reduction in energy consumption is 23 percent, and they’re paid back in energy savings in two years or less.” In offices and branches, savings can be found by turning off computers across the enterprise when not in use, consolidating printers, trying to go paperless, and replacing PCs with “dumb” terminals that use 80 percent less power.
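
The economics Sams describes come down to a simple payback calculation. The Python sketch below shows that arithmetic with made-up cost and savings figures; it is not IBM's pricing or methodology.

    # A simple payback-period sketch; the cost and savings inputs are hypothetical.
    def payback_years(project_cost: float, annual_energy_savings: float) -> float:
        """Years until cumulative energy savings cover the up-front engagement cost."""
        return project_cost / annual_energy_savings

    # Example: a $40,000 optimization project that trims $25,000 a year off the
    # power bill pays for itself in about 1.6 years.
    print(f"{payback_years(40_000, 25_000):.1f} years")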

Tech manufacturers are heeding the call. In Microsoft’s Vista operating system, the company added 30 new power-management features, including an improved “sleep” mode that can save customers an estimated $50 a year in energy cost for every PC.

Computer servers are now being designed to be more heat-resistant so that data centers don’t have to be kept so cold that “you could store frozen dinners there,” according to Bill Kosik, HP’s director of energy and sustainability for critical facility services. HP and IBM, among other I.T. firms, now make a business of conducting “energy analyses” for clients and offering up “green” plans for I.T.

That’s what got Citi started. “Nobody can turn on a dime like the I.T. industry,” says Citi’s Glass. “When it comes to introducing more efficient hardware, it doesn’t take them as long as the auto industry to come out with a more energy-efficient design.”

Virtualization and Green IT

Many businesses are looking for convenient and cost-effective ways to go green and reduce their impact on the planet. Virtualization is a popular solution that can have a positive impact on the environment and your bottom line.

According to the federal Department of Energy, data centers use as much as 3% of American electricity or about 120 billion kilowatt hours per year. This adds up to approximately $7.4 billion. With data creation increasing rapidly, it’s important to consider improved options for storage space and support.

“As servers guzzle up resources at business, you’d think that executives would look for a more efficient way. However, I find that many organizations are running servers at about 5% capacity. Through virtualization, we can help these companies get that rate up to 80% or higher,” said NetGain Technologies Director of Storage and Virtualization Bryan Jackson.

In addition to increased efficiency, Jackson noted that a virtualized environment also helps reduce server sprawl, cut maintenance costs and lower the overall energy consumption of the facility.

In fact, virtualization can reduce data center energy costs by up to 80%, according to VMware, a global leader in virtualization and cloud infrastructure. And this type of consolidation can improve IT capacity through improving server utilization by running fewer, highly-utilized servers – freeing up power and space.
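
To make the consolidation arithmetic concrete, the Python sketch below estimates how many virtualization hosts could carry the same aggregate load as a fleet of lightly used physical servers. The 5 percent and 80 percent utilization figures echo Jackson's numbers above; the server counts are hypothetical.

    import math

    # A rough consolidation estimate: pack many lightly loaded servers onto a few
    # well-utilized hosts. Utilization figures echo those quoted above; counts are made up.
    def hosts_needed(physical_servers: int, avg_utilization: float, target_utilization: float) -> int:
        """Estimate how many virtualization hosts can carry the same aggregate load."""
        aggregate_load = physical_servers * avg_utilization
        return math.ceil(aggregate_load / target_utilization)

    # Example: 200 servers idling at 5% utilization represent 10 servers' worth of work;
    # at an 80% utilization target, roughly 13 hosts would suffice.
    print(hosts_needed(200, 0.05, 0.80))   # -> 13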

This is critical because big data is predicted to keep growing. IDC recently forecast that the big data services market will expand at a 31.7% compound annual growth rate – about seven times faster than the overall I.T. market, which is growing rapidly in its own right.

“I tell clients all the time that data is immortal. With all of our backups and archives, it never goes away. Think about it: when is the last time you really deleted something? Even when you think you’ve deleted something, you really didn’t; but that’s another story,” explained Jackson. “It’s critical that businesses explore their options when it comes to data storage. Virtualization could be a solution for your business and has some environmental benefits too.”

Bryan Jackson and NetGain Technologies have a bit of a green thumb when it comes to virtualization. They help companies reduce costs while benefiting the environment, helping to make Earth Day a little greener.

About NetGain Technologies:
NetGain Technologies is a leading provider in the design, procurement, implementation and management of high-performance IT solutions. With services ranked among the best in the world by MSPmentor and CRN Tech Elite, a multi-state regional footprint and almost three decades of experience, we’ve helped over 1,000 unique clients thrive by leveraging our best-in-class service and support programs. Our highly qualified and experienced professionals align our support programs with clients’ needs to deliver positive business outcomes.

10 Green Cloud Computing Hosts To Consider

With the IT industry now accounting for more than ten percent of global electricity consumption, and data centres alone accounting for almost two percent, the pressure is on businesses to use the greenest possible cloud computing providers. Here we look at the green credentials of ten of the most environmentally friendly hosts available, in no particular order:

1. EVRY

This Norwegian hosting company is one of the largest in Scandinavia. It is also arguably the most environmentally friendly thanks to its use of the Green Mountain Data Centre. The data centre is the self-styled greenest data centre in the world, using hydropower to produce the required electricity and icy fjord water to cool the servers.

2. GreenQloud

Headquartered in Iceland, GreenQloud offers cloud computing services from data centres that are powered by 100 percent renewable energy sources – specifically hydropower and geothermal energy. Iceland’s geographic location also aids the company’s green credentials, with a year-round cold climate offering a natural coolant, and its mid-Atlantic position removing the need for multiple data mirrors. After being founded in 2010 the company has expanded to offer server hosting, online storage, backup, and cloud computing.

3. Google Compute Engine

Google’s data centres already use fifty percent less energy than a typical data centre by reducing their overhead energy usage (cooling, power conversion, etc.) to just twelve percent. Not only has the company made its own processes environmentally friendly, it also shares information and best practices in a bid to improve the entire IT industry.

4. CloudSigma

Based out of Zurich in Switzerland, CloudSigma is plugged into one of the greenest electricity grids in the world – over 95 percent of Swiss energy is generated from nuclear and renewable sources. The company only uses certified carbon neutral cloud servers and has been recognised by Greenpeace for its dedication to environmental responsibility within the IT industry.

5. Dediserve

This Irish company has been pushing the green agenda since its founding in 2009. With seven data centres around the world, it primarily offers hosting solutions to the European and North American markets. Its virtual servers consume less than 5 per cent of the power of a physical server, meaning just one Dediserve server rack does the work of 25 conventional server racks.

6. Windows Azure

Windows Azure is the Microsoft-powered cloud service. The company has been recognized as the second-largest green power purchaser on the United States Environmental Protection Agency’s Green Power Partnership list, and clients who use their data centres will see a 30 percent drop in their carbon footprint.

7. Apple iCloud

Perfect for individuals and small businesses, Apple’s iCloud uses some of the greenest data centres on the planet. The company uses 100 percent renewable energy across all its data locations, plus 75 percent at its corporate facilities. Its new North Carolina data centre is powered by a huge solar panel farm and even occasionally becomes a net power producer for a local utility company.

8. IBM SmartCloud

IBM SmartCloud offers a fully managed, highly secure IaaS cloud optimized for critical enterprise workloads. In 2012, 27 IBM data centres were recognized as ‘Participants in Data Centre Efficiency’ by the European Commission – the largest portfolio of data centres from a single company to receive the recognition.

9. Akamai

The US-based content delivery network is one of the greenest companies in its sector. For more than ten years Akamai has taken a leading role in minimizing the environmental impact of IT systems. With several initiatives in place that are continually improving efficiency, the company now boasts a grade ‘A’ for Energy Transparency from Greenpeace.

10. Rackspace

Rackspace is one of the leading names in cloud hosting, yet they also have an excellent sustainability program. In 2014 their USA-based operations were named as ‘Green Power Partners’ by the Environmental Protection Agency for the second consecutive year, while their UK data centres and offices run on 100 percent renewable energy.

5 pioneering paths for software development’s new frontier

How forward-thinking developers are beating the old-guard in emerging application markets

Size (and mobility) matters. As desktop PCs lose ground to tablets and smartphones, and the cloud becomes a more mainstream means for software deployment, desktop applications are being elbowed aside by mobile apps and Web services, resulting in a significant shift in the way software is created.

Software development organizations large and small are moving quickly to adopt new tools and new paradigms, adapting existing tool sets, talent pools, and processes to make the most of new computing environments and emerging opportunities.

Gone are the days of bits being passed from one isolated team to another in service of the one true build. Modern app development requires a nimble, cross-functional approach to rapid deployment.

Here’s how several leading-edge development shops are meeting the challenges of this new frontier.

1. Mobile- and service-first development: Tomorrow’s wave — or at least today’s
From the outside, the one development trend fast becoming evident to everyone — end-user, customer, and developer alike — is the emphasis on mobile-first development, along with all the complexity that entails.

“Apps are huge now,” says Matt Powers, CTO of Applico, a developer of mobile and Web apps for a variety of big clients such as Google, Asics, AT&T, National Oceanic and Atmospheric Administration, and the Mayo Clinic. “They used to run locally off the device, so the infrastructure to support them was small. Now you have people bringing their entire brand, and everything they do on the website, if they have one, to mobile.”

If that infrastructure is deployed as SaaS (software as a service), it requires development practices that are orders of magnitude more rigorous and complex than those used for deploying stand-alone apps.

Intuit, the creators of QuickBooks and GoPayment, learned this lesson when the company broadened its services to meet an international market of 1.3 million users — covering 150 countries, 143 currencies, and 46 languages.

“We needed to scale the development from small teams of 10 to 15 engineers to a team of 70-plus engineers,” explained Anshu Verma, architect at Intuit for QuickBooks Online. Engineers needed to develop faster overall, and thus adopted what he calls a “safety-first design pattern,” which draws inspiration from how circuit breakers work. This design allows old and new workflows to co-exist — an important feature for a Web service. “In case of exigencies, we fall back to the old [workflow] on-the-fly and cut over to the new [workflow] when we are confident about it. It really helped us move faster without impacting customers.”
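
The article does not show Intuit's code, but the circuit-breaker idea it describes can be sketched in a few lines. The Python below is a hypothetical illustration, not Intuit's implementation: requests are routed to the new workflow until repeated failures trip the switch back to the proven one. The workflow names are invented for the example.

    # A hypothetical circuit-breaker-style fallback, not Intuit's actual implementation.
    class WorkflowSwitch:
        def __init__(self, new_workflow, old_workflow, failure_threshold: int = 3):
            self.new_workflow = new_workflow
            self.old_workflow = old_workflow
            self.failure_threshold = failure_threshold
            self.failures = 0
            self.tripped = False            # once tripped, stay on the old path

        def run(self, request):
            if not self.tripped:
                try:
                    return self.new_workflow(request)
                except Exception:
                    self.failures += 1
                    if self.failures >= self.failure_threshold:
                        self.tripped = True   # cut back to the proven workflow
            return self.old_workflow(request)

    # Usage (names are illustrative): both workflows stay deployed, and the switch
    # decides per request which one handles it.
    # switch = WorkflowSwitch(new_checkout, legacy_checkout)
    # switch.run(order)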

Intuit’s own development cycle for new features is now a mere two to four weeks, which requires them to use Scrum processes and test automation. “We use tools like Jira, Greenhopper, and Rally to facilitate the iterations across all key stakeholders — product management, development, QA, deployment — and created testing frameworks using modern tools like WebDriver, PhantomJS, JSUnit, DOH [Dojo Objective Harness], JUnit, and so on.”

On the other hand, changing practices for the sake of going mobile doesn’t make much sense unless it’s backed with sound business strategy. Such is Joel Semeniuk’s belief; he has been a project manager for more than 20 years and is now executive vice president of agile project management at Telerik, which creates a broad palette of software testing tools and UI components for .Net.

“Since mobile is all the rage,” Semeniuk says, “it’s too easy to get caught up in the mobile-first strategy and not stop to consider the real customer value provided by the mobile application.” Such value includes creating the right tool for the needed job: “Some applications aren’t well-suited with mobile scenarios — for example, those that require large amounts of data entry.”

2. Find the right mix of development methodologies to meet project needs
The wars over development methodologies — agile, XP (extreme programming), waterfall, and so on — are fast giving way to a more fluid and flexible approach to producing and refining a product. Telerik’s Semeniuk is one of many in the modern development world who sees development methodologies not as dogmas to be followed to the letter, but toolkits to be raided for what’s useful. Confining a development team to one methodology is becoming a thing of the past.

“We have an iteration pattern for each problem,” says Semeniuk, “in which we continually adjust or ‘pull in’ new agile practices that solve those problems. Sometimes we ‘pull in’ all of Scrum, because it solves the wide range of problems that come up in that particular environment. Sometimes we pull in pieces of Scrum or XP or Kanban because it makes better sense if you’re in maintenance mode.” Semeniuk calls this the “agile buffet table” model.

For Telerik, though, the most important motive behind using any particular development practice is why. “We like to start with the ‘why’ and use agile to solve a real problem we’re having. The biggest reason for that is when we try to just push out practices, they don’t stick; people don’t identify with the reasons these practices make sense. And not all practices fit all projects,” he says.

Applico’s Powers says his company also uses a variety of development models — mainly agile and iterative. In his case, the “why” is driven by client needs.

“Some clients like rigid development timelines and documentation,” says Powers, “especially ones that want to bring it in house. Others like the fluidity of the agile process and the ability to be brought in the loop at all times.”

Some, however, caution that agile can’t simply be sprayed onto an existing development process. A former program manager who has declined to be named but has five years of experience as a Scrum master has time and again seen agile used in development, but with no corresponding changes in other facets of bringing software to market.

“There’s no intermittent QA; instead, there’s old-school ‘toss it over the wall to QA’-style QA,” he says. “Instead of regular releases, they’re using agile to get a release out, then having the schedule disrupted by support.” In his purview, there has been a battle between traditional software releases and agile, with a lot of people simply using agile merely to drive old-school models.

3. Go with shorter lifecycles, cross-functional teams
The “mobile first” philosophy of modern development has also changed application lifecycle management in striking ways.

Cross-functional teams

“Referring to a ‘shorter development cycle’ is misleading for Web development,” says Andrew Frankel, former VP of engineering at TopShelfClothes.com. “It’s no longer necessary to actually complete a full develop-QA-release cycle for every change. Small changes, such as changing text, can skip the usual process, since they can be deployed without any user impact. That frees the QA team to focus on testing actual application changes.” Mobile and desktop app developers, he adds, aren’t as lucky, since every change requires a new version.

For Telerik’s Semeniuk, the biggest changes to application management are in Web and mobile. For those areas, he says, “You absolutely need short release cycles, because it’s very difficult to pinpoint true customer value and interaction without actually measuring it.”

This means getting items into customers’ hands fast via a solid automation and deployment mode. “This has triggered a new flavor of app management called devops, where the dev team and the ops team need to work closely together to make sure that, as feedback is required, they can get that software into the hands of users without a lot of pain,” he says.

Semeniuk also feels that, for larger organizations, overall team composition isn’t shifting as quickly as it could to react to these changes: “Teams [in smaller organizations] have been shifting from functional roles — business analyst teams, testing teams, deployment teams, etc. — to cross-functional teams, where all the skills to envision, build, and deploy an application are on a single team. Teams then work together as a whole to deliver that software instead of handing it off between functional teams.”

Some enterprises have a hard time making this shift to cross-functional roles, but Semeniuk believes this will change when “organizations can realize that an HR structure does not need to dictate a team structure.”

4. Inventive use of the standard development toolkit
Modern development teams are extending that ingenuity right down to the tools they use, employing popular development tools in new ways to spur further innovation in the development process. Consider Git, the open source revision control system, which can be used for much more than its primary purpose. For Frankel, Git was also a way to perform process automation.

“Driving deployments with Git is fantastic for release management,” Frankel says. “We have a complete log of what changed, at what time, and for what reasons. Larger organizations often try to collect exactly those data points using formal change requests, which tend to be frictional in a fast-paced environment. It’s much more efficient to create a process where that information is collected automatically.”
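
As a concrete illustration of harvesting that information automatically, the short Python sketch below shells out to git log to collect one line per commit between the previous release tag and the ref being deployed. It is a generic example, not TopShelfClothes.com's tooling, and the tag name is hypothetical.

    import subprocess

    # A generic sketch of pulling release metadata straight from the Git history:
    # one line per commit (hash, date, author, subject) since the last release tag.
    def release_notes(previous_tag: str, current_ref: str = "HEAD") -> str:
        result = subprocess.run(
            ["git", "log", "--date=short",
             "--pretty=format:%h %ad %an %s",
             f"{previous_tag}..{current_ref}"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    # Example (the tag name is made up): record what this deployment ships.
    # print(release_notes("release-2013-04-01"))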

At Telerik, Git was adopted by one team as an escape hatch from the usual in-house development methodology.

“We have one division that has chosen not to use our development infrastructure, which is primarily Microsoft Team Foundation-based,” says Telerik’s Semeniuk. “They decided to do something that better fit their culture, experience, and needs, and started off with Git. A whole different form of release management with different tools, but it fit their culture and the experience of their team members a whole lot better.” The team in question might not have to choose between Microsoft’s workflow and Git before long, though; Microsoft recently added Git support to Visual Studio and Team Foundation Server.

Git has also been put to use to support other parts of development such as documentation. The Gitit wiki system uses Git (or another version control system) to track and preserve changes to a community-created set of documents.

It should also come as no surprise that the cloud figures into most everyone’s work as a cutting-edge software development tool. But it’s not just a place to host code or a site — it’s also being eyed as a testing framework. Applico, in particular, is developing a cloud-based foundation for automated testing of its apps.

“With Android especially, you have an international market with the product running on over 500 devices,” Applico’s Powers explains. “If you can integrate this into a system where you can simulate all these different device types, you’re going to catch a lot of issues before you go to market.” To that end, Applico has been looking at a few vendors to provide tools to take the company’s application builds, host them in the cloud, and perform the emulation there.

This approach seems like an attempt to refute what Sebastian Holst has claimed in “The Rise of Application Analytics: A New Game Demands New Rules.” There, Holst states, “You cannot simulate production,” meaning “the diversity and distribution of on-premises and cloud-based services combined with the dizzying array of devices and client-runtimes makes comprehensive testing and profiling prior to production not just difficult, but impossible.”

To Holst, the solution lies in application analytics: real-time harvesting of data from application behaviors as per Telerik’s work. Applico’s idea is to expand the way we perform and automate testing — not to displace analytics as a test methodology, but to use the cloud as a way to reduce the burden of testing.

Two of the most widely used tools for automation, Puppet and Chef, are also being used in creative ways.

“Using Chef and cloud servers for manual testing is fantastic,” says Frankel, “since those servers will only be used occasionally for a few hours. When we’re done testing, we turn off the lights and avoid paying for idle capacity. It only takes a single command to re-create a new staging environment the next time we want to test.” The same process is also possible with Puppet.

Puppet vs. Chef at a glance

Language: Puppet uses mainly its own custom JSON-like language, although a Ruby option is available beginning in version 2.6; Chef uses a subset of Ruby.
License: Puppet is Apache-licensed (earlier versions are GPL); Chef is Apache-licensed.
Approach: With Puppet, you list dependencies and Puppet figures out how to order the install; with Chef, you write an install script in Ruby using all of the extra helper functions from Chef.
Basic version: Puppet costs $99 per node per year (annual term license) with the first 10 nodes free, and discounts kick in for larger installations; Chef costs $120 per month for 20 nodes or $300 per month for 50 nodes.
Premium version: The Premium version of Puppet Enterprise is priced by the sales team; Chef is $600 per month for 100 nodes.
Deployment: Puppet Enterprise runs on your machine; Private Chef runs on your machine, while Hosted Chef (same price) runs in Opscode’s cloud.

5. HTML5 — a handy, albeit hyped, solution for increasing device fragmentation
Given the current focus on mobile-first development, a great deal of attention is being paid to HTML5 and what role it will play. On the one hand, developers are quickly jumping into HTML5, because not doing so would be self-defeating. On the other hand, HTML5 is clearly no cure-all.

Applico’s Powers takes a dim view of HTML5 as a mobile platform.

“HTML5 will never catch up to native development,” he insists. “If you think of running everything in a Web view, you’re just abstracting a layer between yourself and the native code. It’s always going to be a step behind, and as new versions of the OSes come out, tools like PhoneGap and Titanium have to react to those changes.”

In his opinion, HTML5 is best used for enterprise apps, such as a data-submission form, not immersive-experience apps.

Powers described experiences in his work that shed further light on this. Applico competitors lured clients away from Applico, offering to build apps with HTML5 at half the cost Applico quoted. “Eight months later, those clients would come back to us and say, ‘We made the wrong decision; we went with someone that promised us the world and didn’t really understand the limitations of the technologies.'”

Last year, Hung LeHong and Jackie Fenn, both of Gartner, placed HTML5 at the “peak of inflated expectations” on Gartner’s annual Hype Cycle Report, estimating it would be five to 10 years before the real plateau for the standard could be reached. Yet many developers are embracing HTML5 and find Gartner’s analysis to be way off-base.

Kendo UI, a division of Telerik, performed its own studies and found that 82 percent of developers “find HTML5 important to their job within the next 12 months,” with 31 percent planning to use it and 63 percent actively developing in it.

That said, the phrasing of these questions doesn’t speak to developer preferences, only to what developers are doing — that is, building HTML5 apps because it’s part of their job description. What’s more, another survey sponsored by Appcelerator and IDC for 2012 found that most of the mobile developers surveyed were “neutral to dissatisfied with HTML5” in several categories, including performance (72.4 percent of those surveyed), fragmentation (75.4 percent), and user experience (62 percent). This is striking in light of how an earlier survey by the same group asked developers, “Do you plan to integrate HTML5 as a component into the mobile apps you plan to build in 2012?” — to which 79 percent answered yes.

Todd Anglin, vice president for HTML5 Web and mobile Tools at Telerik, questioned this conclusion, and not just because of the rapid development of HTML5 on all sides: “Developers should note that the new ‘native’ Facebook apps still include HTML5 in sections where Facebook wants the ability to change things more quickly,” Anglin wrote, referencing the much discussed shift Facebook undertook in 2012 to native mobile apps due to shortcomings it experienced with HTML5.

In short, for now HTML5 may be best thought of as merely one ingredient in an application’s overall composition, rather than the way to create an app.

Conclusions
With so much software produced now aimed at a mobile or service-oriented market, development techniques are evolving to suit. Desktop programs that went for years between major revisions are being supplanted by mobile apps that are point-revved every few months or by services that are revved continually behind the scenes.

The demands those changes make are major, but they’ve also spurred numerous creative new solutions, including new use cases for traditional tools and the cloud as a development and testing platform, rather than just a software delivery mechanism.

The increasing speed of development (and developer feedback) means new technologies — witness HTML5 — are getting field-tested and absorbed into the mix more quickly, hastening the pace of relevancy.

As always, though, application development isn’t about a particular paradigm, tool, or methodology — it’s about what works, here and now.

Cloud computing: ten ways it will change by 2020

Right now we are in the early days of cloud computing, with many organisations taking their first, tentative steps. But by 2020 cloud is going to be a major — and permanent — part of the enterprise computing infrastructure.

Eight years from now we are likely to see low-power processors crunching many workloads in the cloud, housed in highly automated datacentres and supporting massively federated, scalable software architecture.

Cloud 2020

Analyst group Forrester expects the global cloud computing market will grow from $35bn (£22.5bn) in 2011 to around $150bn by 2020 as it becomes key to many organizations’ IT infrastructures.

Alongside this increase in demand from enterprise, there will be development in the technologies that support clouds, with rapid increases in processing power making cloud projects even cheaper, while technologies currently limited to supercomputing will make it into the mainstream.

And of course, by 2020, a generational shift will have occurred in organisations that means a new generation of CIOs will be in charge who have grown up using cloud-based tools, making them far more willing to adopt cloud on an enterprise scale.

1. Software floats away from hardware

John Manley, director of HP’s Automated Infrastructure Lab, argues that software will become divorced from hardware, with more and more technologies consumed as a service: “Cloud computing is the final means by which computing becomes invisible,” he says.

As a result, by 2020, if you were to ask a CIO to draw a map of their infrastructure, they would not be able to, says David Merrill, chief economist of Hitachi Data Systems. “He will be able to say ‘here are my partner providers’,” he says, but he will not be able to draw a diagram of his infrastructure.

This is because the infrastructure will sit in a “highly abstracted space”, where software is written in such a way that it goes through several filters before it interacts with hardware. This means that front-end applications, or applications built on top of a platform-as-a-service, will be hardware agnostic.
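
A minimal sketch of what such hardware-agnostic layering can look like in code: the application below talks only to an interface, and any provider backend that implements it can be swapped in. All names are illustrative and not drawn from any specific platform-as-a-service.

    from abc import ABC, abstractmethod

    # An illustrative abstraction layer: the application never sees which provider
    # or hardware sits behind the interface it codes against.
    class ObjectStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class InMemoryStore(ObjectStore):
        """Stand-in backend; a cloud provider's store would implement the same interface."""
        def __init__(self):
            self._blobs = {}
        def put(self, key, data):
            self._blobs[key] = data
        def get(self, key):
            return self._blobs[key]

    def archive_report(store: ObjectStore, report: bytes) -> None:
        # The front-end application only knows the interface, not the infrastructure.
        store.put("reports/latest", report)

    archive_report(InMemoryStore(), b"quarterly figures")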

2. Modular software

To take advantage of the huge armadas of hardware available via clouds, individual software applications are set to get larger and more complex as they are written to take advantage of scale.

With the growth in the size and complexity of individual programs, the software development process will place an emphasis on modular software — as in, large applications with components that can be modified without shutting down the program.
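
One small-scale analogue of that modularity is Python's ability to reload a module while the process keeps running. The sketch below writes a throwaway component to disk purely to keep the example self-contained; the module name is invented for illustration.

    import importlib
    import pathlib
    import sys

    # Self-contained illustration: the "component" is a throwaway module written to disk,
    # then replaced and reloaded without stopping the running process.
    sys.path.insert(0, ".")
    pathlib.Path("pricing_rules.py").write_text("RATE = 0.10\n")
    import pricing_rules
    print(pricing_rules.RATE)               # old behaviour: 0.1

    pathlib.Path("pricing_rules.py").write_text("RATE = 0.125\n")   # a new version ships
    importlib.reload(pricing_rules)          # swap it in; the rest of the service keeps running
    print(pricing_rules.RATE)               # new behaviour: 0.125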

As a consequence, cloud applications will require a new programming mindset, especially as they interact with multiple clouds.

“Software has to be thought about differently,” HP’s Manley says, arguing that the management of federated services will be one of the main 2020 challenges. This is because applications are not only going to be based in the cloud, but will hook into other clouds and various on-premise applications as well.

In other words, different parts of applications will “float around” in and out of service providers. Assuring good service-level agreements for these complex software packages will be a challenge, Manley says.

3. Social software

Along with the modular shift, software could take on traits currently found in social-media applications like Facebook, says Merrill. Programs could form automatic, if fleeting, associations with bits of hardware and software according to their needs.

“It will be a social-media evolution,” Merrill says. “You will have an infrastructure. It’ll look like a cloud, but we will engineer these things so that a database will ‘like’ a server, [or] will ‘like’ a storage array.”

In other words, the infrastructure and software of a datacentre will mould itself around the task required, rather than the other way around. Developers will no longer need to worry about provisioning storage, a server and a switch, Merrill says: all of this will happen automatically.

4. Commodity hardware rules

By 2020 the transition to low-cost hardware will be in full swing as schemes such as the Open Compute Project find their way out of the datacentres of Facebook and Amazon Web Services and into facilities operated by other, smaller companies as well. “Servers and storage devices will look like replaceable sleds,” says Frank Frankovsky, Facebook’s VP of hardware design and supply chain, and chairman of the Open Compute Project.

“Cloud computing is the final means by which computing becomes invisible” — John Manley, HP

By breaking infrastructure down into its basic components, replacements and upgrades can be done quickly, he says. The companies best placed to use this form of commoditised infrastructure are large businesses that operate huge datacentres. “I would say that between now and 2020, the fastest-growing sector of the market is going to be cloud service providers,” Frankovsky says.

5. Low-power processors and cheaper clouds

We’re around a year away from low-power ARM chips with 64-bit capability coming to market. Once that happens, uptake should accelerate as enterprise software is developed for the RISC chips, allowing companies to use the power-thrifty processors in their datacentres and thereby cut their electricity bills by an order of magnitude.

HP has created a pilot server platform — Redstone — as part of its Project Moonshot scheme to try to get ARM kit to its customers, while Dell has been selling custom ARM-based servers to huge cloud customers via its Data Center Solutions group for years.

By 2020 it’s likely that low-power chips will be everywhere. And it won’t just be ARM — Intel, aware of the threat, is working hard on driving down the power used by its Atom chips, though most efforts in this area are targeted at mobile devices rather than servers. Facebook thinks ARM adoption is going to start in storage equipment, then broaden to servers.

“I really do think it’s going to have a dramatic impact on the amount of useful work, per dollar, you can get done,” Frankovsky says. This should help cloud providers, such as Amazon Web Services, cut their electricity bills. Moreover, if they are caught in a price war with competitors, they are more likely to pass on at least a chunk of the savings to developers, in the form of price reductions.

6. Faster interconnects

The twinned needs of massively distributed applications and a rise in the core count of high-end processors will converge to bring super-fast interconnects into the datacentre.

Joseph Reger, chief technology officer of Fujitsu Technology Solutions, predicts that by 2020 we can expect communications in the datacentre to be “running at a speed in the low hundreds of gigabits per second”.

Reger says he expects that there will be a “very rapid commodification” of high-end interconnect technologies, leading to a very cheap, very high-performance interconnect. This will let information be passed around datacentres at a greater rate than before, and at a lower cost, letting companies create larger applications that circulate more data through their hardware (known in the industry as ‘chatty’ apps), potentially allowing developers to build more intelligent, automated and complex programs.

7. Datacentres become ecosystems

Cloud datacentres will “become much like a breathing and living organism with different states”, Reger says. The twinned technologies of abstracted software and commodified hardware should combine to make datacentres function much more like ecosystems: an over-arching system rules the equipment via software, with hardware controlled from a single point but growing and shrinking according to workloads.


Automation of basic tasks, such as patching and updating equipment, will mean the datacentre “will become more like a biological system”, he says, in the sense that changes and corrections are made automatically.

8. Clouds consolidate

The internet rewards scale, and with the huge capital costs associated with running clouds, it seems likely that there will be a degree of consolidation in the cloud provider market.

Fierce competition between a few large providers could be a good thing, as it would still drive each of them to experiment with radical technologies. For example, in a bid to cut its internal networking costs and boost utilisation, Google has recently moved its entire internal network to the software-defined networking OpenFlow standard, which looks set to shake up the industry as more people adopt it.

Manley of HP argues there will be a variety of clouds that will be suited to specific purposes. “There’s going to be diversity,” he says. “I think you would only end up with a monopoly if there was an infrastructure around that was sufficiently capable to meet all the non-functional [infrastructure requirements] of those end services.”

9. The generational shift

By 2020, a new generation of CIOs will have come into companies, and they will have been raised in a cloudy as-a-service world. There will be an expectation that things are available “as-a-service”, Merrill says: “Our consumption model is changing as a generational issue.”

And this new generation may lead to a shake-up in how businesses bill themselves for IT, Merrill says. “We have these archaic, tax-based, accounting-based rules that are prohibiting innovation,” he adds.

10. Clouds will stratify

Today clouds are differentiated by whether they provide infrastructure-as-a-service, platform-as-a-service or software-as-a-service capabilities, but by 2020 more specialised clouds will have emerged.

According to Forrester, we can expect things like ‘middle virtualisation tools’ and ‘dynamic BPO services’ to appear by 2020, along with a host of other inelegant acronyms. In other words, along with some large providers offering basic technologies like storage and compute, there will also be a broad ecosystem of more specific cloud providers, allowing companies to shift workloads to the cloud that would otherwise be dealt with by very specific (and typically very expensive) on-premise applications.

Merrill says clouds will, like any utility, be differentiated by their infrastructure capabilities into a whole new set of classes. “Just as we have power generation from coal, from natural gas, nuclear, hydroelectric, there will be differences,” he says. “The economics, in my opinion, help us with differentiation and categorisation.”