Saturday, November 7, 2009

Global Manufacturing and Consumption

Another significant shift in the manufacturing industry is that product development and production have become widely distributed. It’s no surprise to find “Designed by Apple in California. Assembled in China” on the back of an iPod owned by a 16-year-old boy in Spain. Production offshoring and global marketing give companies opportunities to cut costs and to reach more consumers, but these activities also require more collaboration with upstream and downstream partners. Product data transparency between a manufacturer and its suppliers (in other words, consistent BOM information throughout its supply chain) becomes an important issue when companies want less expensive production resources but still need to keep up with ever-shortening time-to-market. With old-fashioned methods, an engineering change that alters materials may take days to reach suppliers, and those suppliers may in turn have a few layers of suppliers of their own.

Consistent BOM throughout the whole supply chain relies on integration. First, internal integration ties together all the information systems running within an organization (PDM/PLM, ERP, SCM, etc.) that rely on accurate BOM data. This integration allows companies to have accurate and consistent product information whenever it is needed. Second, external integration connects all parties on the value chain. Based on electronic data interchange (EDI) or other means of data exchange, external integration gives enterprises a common view of the product structure and other critical data, so companies can collaborate across organizational borders.

Why Managing BOM Is Such a Big Task

In the discrete manufacturing sector, the bill of materials (BOM) is a fundamental piece of product data that exists throughout the major stages of a product’s life cycle. According to Wikipedia, a BOM describes the raw materials, parts, subcomponents, and components needed to manufacture a finished product. Simply speaking, a BOM is a list of all the materials that must be assembled into a product. The concept is clear and simple, and managing BOMs doesn’t seem like a difficult task, especially when we have a powerful tool—software—in hand. However, this is true only when the product structure is so simple that not much collaboration is needed to develop the product, when consumers are delighted to have the same products that everyone else has, and when design, engineering, and production are performed under the same roof. The truth is, during the past few decades, the landscape of the manufacturing sector has changed dramatically, and it is still changing at a rapid pace.
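To make the concept concrete, here is a minimal sketch of a BOM modeled as a tree, with a routine that rolls a nested structure down to total quantities per purchased part. The class, part numbers, and the bicycle example are all hypothetical, purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class BomItem:
    """One line on a bill of materials: a part and how many of it are needed."""
    part_number: str
    description: str
    quantity: int                                 # quantity per one unit of the parent
    children: list = field(default_factory=list)  # sub-components, if any

def total_quantities(item: BomItem, multiplier: int = 1) -> dict:
    """Flatten a nested BOM into total quantities of each leaf-level part."""
    if not item.children:
        return {item.part_number: multiplier * item.quantity}
    totals = {}
    for child in item.children:
        for part, qty in total_quantities(child, multiplier * item.quantity).items():
            totals[part] = totals.get(part, 0) + qty
    return totals

# Hypothetical bicycle: two wheels, each built from 32 spokes and 1 rim.
bike = BomItem("BIKE-1", "Bicycle", 1, [
    BomItem("WHL-1", "Wheel", 2, [
        BomItem("SPK-1", "Spoke", 32),
        BomItem("RIM-1", "Rim", 1),
    ]),
    BomItem("FRM-1", "Frame", 1),
])
print(total_quantities(bike))  # {'SPK-1': 64, 'RIM-1': 2, 'FRM-1': 1}
```

Even in this toy form, the recursion hints at why multi-level BOMs get hard to manage once several departments each maintain their own variant of the tree.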

Collaborative Product Development

As time moves on, products become not only more complicated in structure, but also impossible to develop exclusively by a single department. In fact, developing a product is now a corporate-wide activity that involves almost every function of a company, from strategic planning, to sales and marketing, to after-sales services.

To see how things get more complicated, we don’t even need to look at all the participants. Let’s stay with three functions—product design, engineering design, and production—for a while. When the product design department finishes its work, a design BOM is generated. Ideally, this BOM would be carried throughout subsequent processes. However, this is not very likely to happen. For example, a single part created by the product design team might be split into two parts by the engineering design team for feasibility of production; when the production team receives the production order, it might decide to use another material (one that also meets the requirements) to produce the parts, since a large amount of this material is in stock due to a cancelled order.

The differences among the design BOM, engineering BOM, and production BOM create inconsistencies in product data along the product’s life cycle, and sometimes increase product cost and time-to-market. Besides these three types of BOM, there are also the customer BOM, sales BOM, maintenance BOM, cost BOM, etc., all used for different purposes, which complicates things further. One way to resolve this problem is to bridge the information gaps on a constant basis under a change management mechanism, a fundamental capability of product lifecycle management (PLM) solutions.

Mass Customization

To meet the increasing demands of consumers who want more personalized products without significant increases in price, many manufacturers now practice mass customization of products ranging from automobiles to computers—even apparel. The modular BOM is one of the enablers of mass customization. It defines the components needed to produce a subassembly, and provides cost information for each component and a “rolled-up” cost for the overall subassembly. Nowadays, one product may have many configurations; if computer systems stored each possible configuration as an independent BOM, BOM maintenance would become almost impossible.
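As a rough sketch of the “rolled-up” cost idea, the following sums component costs through one subassembly. The wheel components and prices are invented for illustration:

```python
# A modular BOM entry: component -> (quantity, unit_cost, optional sub-BOM).
WHEEL = {
    "spoke": (32, 0.40, None),
    "rim":   (1, 12.00, None),
    "hub":   (1,  8.00, None),
}

def rolled_up_cost(bom: dict) -> float:
    """Sum each component's cost, recursing into sub-assemblies where present."""
    total = 0.0
    for quantity, unit_cost, sub_bom in bom.values():
        total += quantity * (rolled_up_cost(sub_bom) if sub_bom else unit_cost)
    return total

print(rolled_up_cost(WHEEL))  # 32*0.40 + 12.00 + 8.00 = 32.8
```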

Configurable BOM is another enabler for mass customization. By using this BOM, buyers and manufacturers can create “end-items” dynamically. Based on this configurability, Quote-to-order (Q2O) solutions (sometimes known as configure, price, and quote, or CPQ) enable manufacturers to mobilize their mass customization initiatives. These systems can reduce time-consuming quoting and ordering processes, decrease unit costs, and lower sales costs.
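To illustrate what creating an “end-item” dynamically might look like, here is a toy configurator sketch. The option groups and part numbers are invented; a real Q2O/CPQ system would add constraint rules, pricing, and validation on top of this:

```python
# Each option group maps a selectable choice to the parts that choice pulls in.
OPTION_GROUPS = {
    "cpu":     {"standard": ["CPU-STD"], "fast": ["CPU-FAST"]},
    "storage": {"small": ["SSD-256"],   "large": ["SSD-1TB"]},
}
BASE_PARTS = ["CHASSIS-1", "PSU-1"]  # common to every configuration

def configure(choices: dict) -> list:
    """Build an end-item parts list: the base parts plus one choice per group."""
    parts = list(BASE_PARTS)
    for group, choice in choices.items():
        parts += OPTION_GROUPS[group][choice]
    return parts

print(configure({"cpu": "fast", "storage": "small"}))
# ['CHASSIS-1', 'PSU-1', 'CPU-FAST', 'SSD-256']
```

The point is that only the option groups are maintained as data; the end-item BOM is generated at order time rather than stored as one of thousands of pre-enumerated BOMs.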

Product Review: Patch Deployment

Once the initial scan is complete, you will probably want to deploy any missing patches or service packs. To do so, go to the Security Scanner container at the top of the user interface and right-click the computer that you want to update. You will have the option of deploying the patches onto the selected computer or onto all computers. LANguard will send users a message before the deployment process begins and will stop any necessary services on their machines.

Earlier I mentioned that one of the big drawbacks to Microsoft's SUS is that it can manage patches for only a limited number of Microsoft products. This is not the case with GFI LANguard. GFI LANguard can handle patch management for all Microsoft server products, operating systems, and even Microsoft Office. It can even deploy patches for non-Microsoft products (although the need for such patches is not automatically detected). Although GFI LANguard is clearly superior to SUS, GFI recommends using GFI LANguard as a complement to SUS rather than as an alternative to it. In fact, GFI has published a white paper that details the specifics of using SUS and GFI LANguard together. You can read this white paper at www.gfi.com/whitepapers/patch-management.pdf.

Another reason why using GFI LANguard in conjunction with SUS is an ideal patch management solution is the timeliness of patch deployment. You probably remember the SQL Slammer worm, which exploited a hole in SQL Server. A patch had been available from Microsoft for months before the worm struck, and yet millions were affected because they had not patched SQL Server quickly enough. GFI LANguard allows you to deploy patches immediately to all of your computers. You also have the option of scheduling both scans and patch deployments. Additionally, you can set up various types of alerts, so that if a security scan detects a critical vulnerability, you can be notified immediately and take action.

Product Review: Security Scanning

GFI LANguard is much more than a patch management product though. Any patch management solution will scan your network for missing patches. GFI LANguard raises the bar by also scanning the network for other types of potential security vulnerabilities.

The nice part about this feature is that you don't have to do any extra work to perform a full-blown security scan against your network. When you scan your network for missing patches, GFI LANguard will also check for things like open shares, open ports, and unused user accounts. The software also checks for security vulnerabilities related to audit policies, password policies, user accounts, groups, and computers.
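For a sense of what one of these checks involves under the hood, here is a minimal open-port probe using only Python's standard library. To be clear, this is not GFI's code; it is just a sketch of the general technique of attempting TCP connections to well-known ports:

```python
import socket

# A few well-known TCP ports and the services conventionally behind them.
COMMON_PORTS = {21: "ftp", 23: "telnet", 80: "http", 139: "netbios", 445: "smb"}

def open_ports(host: str, timeout: float = 0.5) -> list:
    """Return the common ports on `host` that accept a TCP connection."""
    found = []
    for port, service in COMMON_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(f"{service} ({port})")
    return found

print(open_ports("127.0.0.1"))
```

A commercial scanner layers much more on top (banner grabbing, share enumeration, account checks), but the basic probe is this simple.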

When the scan is complete, GFI LANguard offers a dozen different reports that you can view. Many of these reports pertain specifically to security vulnerabilities that have been detected. Best of all, reports exist that focus solely on specific types of vulnerabilities. For example, you can choose to look at only the most serious security vulnerabilities, or to look only at vulnerabilities pertaining to your password policies.

Available Patch Management Solutions

There are many patch management tools available from Microsoft and from third-party software vendors. Microsoft's two primary patch management solutions are Systems Management Server (SMS) and Software Update Services (SUS). Both are good solutions, but both have their limitations. SMS is a comprehensive patch management solution, but it has a hefty price tag and a steep learning curve. SUS is a free patch management utility that is easy to use, but it has some major limitations: SUS cannot deploy patches related to Microsoft SQL Server, Microsoft Exchange Server, or Microsoft Office. Furthermore, SUS cannot deploy patches to machines that are running Windows NT.

These various limitations mean that SUS and SMS simply aren't good fits for many organizations. As an alternative to these two products, many companies are turning to third-party patch management solutions. One particular patch management solution that I really like is GFI's LANguard Network Security Scanner. Although GFI's LANguard has been around for a while, GFI has recently released version 5.

Product Review: GFI's LANguard Network Security Scanner

Without a doubt, one of the most tedious chores that network administrators must routinely perform is patch management. Hardly a week goes by that Microsoft doesn't release some sort of patch. It is the network administrator's responsibility to download the latest patches and apply them to all of the organization's computers. As tedious as patch management is, though, it is one chore that really shouldn't be neglected. Not only do the various patches resolve security vulnerabilities; once a patch is released, the specific vulnerability it addresses is made public, making that vulnerability much more likely to be exploited on unpatched machines.

Monday, October 26, 2009

Microsoft’s Underlying Platform Parts for Enterprise Applications: Somewhat Explained – Part 1

I can partly understand analysts’ temptation to beat up on Microsoft’s forays into the enterprise applications space. To be fair, “the empire” has had its share of strategic and tactical miscues, as if it had wanted to give these naysayers some ammunition. For one, many analysts and market observers first criticized the giant for not having a unified enterprise resource planning (ERP) product line, but rather several diverse ones, stemming from the acquisitions of the former Great Plains Software and Navision Software a/s.

Today, we are talking about the following four Microsoft Dynamics ERP product lines:

1. Microsoft Dynamics GP (formerly Great Plains);
2. Microsoft Dynamics NAV (formerly Navision);
3. Microsoft Dynamics SL (formerly Solomon); and
4. Microsoft Dynamics AX (formerly Axapta).

Additionally, in the early 2000s, Microsoft developed its own customer relationship management (CRM) product, today called Microsoft Dynamics CRM. Last but not least, the Dynamics product family includes two retail management applications, which are a combination of an acquisition and in-house development. These are:

* Microsoft Dynamics RMS (Retail Management System) (formerly QuickSell); and
* Microsoft Dynamics POS (Point of Sale).

Good and Bad “Green”

Perhaps as a knee-jerk reaction or a temptation (who on earth wouldn’t be tempted to manage a single code base and technology set instead of multiple ones?), in the mid 2000s Microsoft espoused the ill-fated “Project Green.” When users of the individual ERP products, partners, and even Microsoft developers raised serious concerns about a full-blown convergence of the products into a single “uber-product” (a la SAP), the project was first backpedaled, reduced in scope, and phased (via Wave 1, Wave 2, etc.), and then fully scrapped.

It was a waste of a good name at an inopportune time, given the public’s infatuation today with anything that is environmentally friendly, i.e., “green.” In any case, with the Project Green frenzy having subsided, analysts can always resort to picking apart the Microsoft Business Solutions (MBS, part of the Microsoft Business Division) profit and loss (P&L) statement.

In other words, is it growing faster than the market, and is it profitable? Well, having recently attended some of Microsoft’s applications user conferences (Convergence 2008) and partner events (Worldwide Partner Conference [WWPC] 2008), I think I got at least some answers and clarifications for myself on both the Project Green and P&L issues.

Once and for all, Microsoft has given up on the lofty idea of building an entirely new ERP system to replace all existing code lines. The daunting “new code line from scratch” effort was admittedly stopped a year or so ago for lack of platform readiness to realize the vision (it was too hard to catch up on the applications’ breadth and depth), and because it would have caused immediate channel disruptions in all ERP product lines.

In other words, each individual product still has its own areas of strengths in terms of vertical industry fit, partners’ geographic coverage, and so on. The more appealing direction was thus to “incubate the vision for a business application suite for the future.” A successful incubation is based on sharing the following research & development (R&D) principles across all the products:

* Role-tailored user experience (UX) with embedded and contextual business intelligence (BI);
* Service oriented architecture (SOA);
* Process-centric and model-driven product design (architecture); and
* Continuous delivery through an evolutionary (rather than revolutionary) product roadmap.

Share (if not SharePoint) is the Key Word Here

In other words, rather than converging (or “fusing”) all products into a next-generation one (which Epicor Software might miraculously pull off with the upcoming Epicor 9 product), Microsoft has opted for the following design principles:

* To natively build on the core Microsoft platform, as much as possible;
* To leverage standard Dynamics tools for high productivity, and proprietary platform tools for specialized requirements (an 80/20, or Pareto, approach);
* To share technology across the Dynamics portfolio whenever appropriate, the idea being to develop assets that are shared by design: built for one product, then adopted across the portfolio; and
* To let information scenarios and user requirements drive future platform innovation.

The best example of sharing (synergy) would be the single UX design team. The role-based user interface (UI) was implemented with shared controls and gadgets, and delivered for all the Dynamics ERP products after being introduced and tested first in Dynamics GP. With such an ability to share future innovation, Microsoft will continue to look for opportunities to innovate in one product and then share (roll out) across the entire portfolio.

The “Better Together” Themes

Given that all of the acquired and developed products have always been on the Microsoft Windows operating system (OS), it is a no-brainer that they will continue to leverage this platform. Some of the potential future benefits of all the Dynamics application products being “better together with Windows” could come from the following: being certified with virtualization capabilities, integration with Windows Essential Business Server (that is suited for mid-size businesses), and from integration with Windows System Resource Management (WSRM) Management Pack.

Microsoft’s Underlying Platform Parts for Enterprise Applications: Somewhat Explained – Part 2

Reporting, Analytics & Collaboration Enablers

To expand further on the use of the Microsoft SQL Server database discussed at the end of Part 1, all Microsoft Dynamics reporting capabilities will in the future come natively (which also means without new license fees) through SQL Server Reporting Services (SSRS) and associated tools. This was first developed within Microsoft Dynamics GP 9 and Microsoft Dynamics AX, and will be adopted more broadly across other Dynamics products. The Microsoft Dynamics AX 4 release introduced the capability of creating ad hoc reports, whereas the most recently released Microsoft Dynamics AX 2009 also uses SSRS for all production reports.

Innovation is now surfacing as a result of integration between the Microsoft Visual Studio.NET (VS.NET) development platform and SQL Server. Namely, there is now the ability to launch the Precision Report Designer, maintain the Dynamics AX semantic models in VS.NET, and pass data in a closed-loop manner to and from Dynamics AX logic models. These models can in turn look into the Dynamics AX database (SQL Server) via secure database views. Future development will make these currently static models dynamic for report-customization purposes.

Along similar lines will be the use of Microsoft SQL Server Analysis Services (SSAS), whereby all Dynamics role centers within the user experience (UX) project (mentioned in Part 1) will feature embedded contextual business intelligence (BI). Currently, Dynamics AX 2009 has the cube generation capability, whereby analytics perspectives have been added to the business logic model, which can generate Data Source Views (DSVs) and Online Analytical Processing (OLAP) cubes. Future research and development (R&D) forays will likely enable round-trip (between VS.NET and SQL Server) advanced features similar to those of the abovementioned reporting tools.

As a little caveat, these native reporting and analytics features will not be automatically available to the users of the proprietary Microsoft Dynamics NAV C/Side database (about half of the install base) or of Dynamics AX Oracle instances. Most Dynamics NAV customers on the older C/Side database upgrade to SQL Server when they move to a new NAV version anyway, while Dynamics AX users on Oracle can access the new reporting and analytics features by adding SSRS and SSAS to their deployments. Still, Microsoft will, for the foreseeable future, honor ongoing support for these databases alongside its SQL Server.

Sharing SharePoint and Unified Communications

Microsoft SharePoint is the platform for portal-based collaboration and document management/enterprise content management (ECM). The product also works tightly with Windows Workflow Foundation (WF) and Unified Communications (UC), both Microsoft technologies that will be described in detail later on. This integration provides great visibility into workflows related to documents and document libraries, and improves collaboration through the “presence” and “click to communicate” features.

Today, SharePoint is the universal portal technology for the Dynamics portfolio; for example, in Dynamics AX 2009, the AX Enterprise Portal (formerly Axapta Enterprise Portal) is now based on SharePoint. The portal was built on the standard SharePoint design experience, whereby a gallery of Dynamics AX Web parts is now available, making it very simple to surface Dynamics AX data (with the inherent AX security model enforced) on SharePoint portal pages.

In addition to Web parts, other strategies for SharePoint integration are its Business Data Catalog (BDC) Web Services feature (currently used within Microsoft Dynamics GP and Dynamics CRM), and data binding (within Dynamics AX). It is likely that BDC services will grow further in importance, and we should expect broad Microsoft Dynamics consistency around this feature.

The abovementioned UC technology provides the ability for applications to identify users’ “presence” and enable “click to communicate” capabilities. Via Microsoft Office Communications Server 2007, Dynamics AX 2009 and Dynamics CRM 4.0 currently work with UC (which is envisioned for the upcoming Dynamics ERP releases too). For example, whenever users see a person on an application screen, they can also see a presence indicator showing whether that person is “out of the office”, “in a meeting”, “on a call”, or “available”. By clicking on the indicator, a user can pick the preferred method of communication with a single click, whether via e-mail, instant messenger (IM), or phone, if the company has computer telephony integration (CTI) capabilities.

The Microsoft Dynamics team is working together with the UC team to develop even more advanced scenarios that bring people closer to the processes represented in their applications. One such possible scenario, “Call Center of the Future”, was shown at Convergence 2008 during Steve Ballmer’s keynote speech. Expected scenarios for the next version of the UC platform will revolve around application embedding, advanced in-context collaboration scenarios, and blending UC with business process management (BPM).

What About the Microsoft .NET Framework Parts?

The situation is much less “crystal clear” when it comes to leveraging components of the Microsoft .NET Framework. Namely, on the programming and development platform side, only Microsoft Dynamics SL leverages Visual Basic.NET (VB.NET), one of the languages embraced by .NET. Having already abandoned the gut-wrenching route of a single code base, as noted in Part 1, Microsoft now has to live with the proprietary platforms within Dynamics GP (i.e., Dexterity), Dynamics NAV (i.e., C/Side AL), and Dynamics AX (i.e., X++/MorphX).

But on the upside, the abovementioned Windows WF technology, an application-hosted workflow orchestration engine with a VS.NET design experience, is used much more pervasively throughout Dynamics. WF tools are VS.NET-based tools for developers, with simplified experiences added for analysts (information workers).

The technology originated in the Microsoft BizTalk Server team (to be described later on), and in a future major release, WF will become BizTalk's orchestration engine. WF is used in SharePoint and within Dynamics applications (i.e., Dynamics GP 10, Dynamics AX 2009, and Dynamics CRM 4.0) as the workflow engine. A distinct feature is its Tracking Provider architectural design (implemented in Dynamics AX 2009), which allows users to capture process execution information in the same database as the transaction data.

There is the ability here to track and record data about WF instances as they execute, such as the current status of long-running processes, time spent across parts of (or the whole of) the process, exception paths taken, etc. This enables analyses such as how much time, or how many escalations, it takes a user to approve purchase orders (POs) valued under US$25,000 for his/her preferred suppliers.
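As a sketch of the kind of analysis this makes possible, consider some hypothetical tracking records; the field names below are invented for illustration, not the actual WF tracking schema:

```python
from datetime import datetime

# Invented workflow-tracking rows, as a Tracking Provider might persist them.
records = [
    {"po_value": 18000, "supplier": "preferred", "escalations": 1,
     "started": datetime(2008, 6, 2, 9, 0), "completed": datetime(2008, 6, 3, 11, 30)},
    {"po_value": 40000, "supplier": "other", "escalations": 3,
     "started": datetime(2008, 6, 2, 10, 0), "completed": datetime(2008, 6, 6, 16, 0)},
]

# Approval time and escalations for POs under $25,000 with preferred suppliers.
subset = [r for r in records if r["po_value"] < 25000 and r["supplier"] == "preferred"]
hours = [(r["completed"] - r["started"]).total_seconds() / 3600 for r in subset]
print(f"average approval time: {sum(hours) / len(hours):.1f} hours; "
      f"total escalations: {sum(r['escalations'] for r in subset)}")
```

Because the tracking rows live in the same database as the transactions, this kind of query can join process metrics directly to business data.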

Furthermore, Windows Communication Foundation (WCF), formerly called Indigo and WinFX, is an application web services inter-communication framework, and can be used to access Dynamics AX 4 and 2009 business logic through web service interfaces. This provides a higher-level document interface to the application for integration, complementing the .NET Business Connector, which offers more granular, lower-level component interfaces to the Dynamics AX business logic. Microsoft’s .NET Business Connector replaces the older Microsoft Component Object Model (COM)-based COM Business Connector.

Personal lessons for living with techno overdose

Are they real needs or just wants? It sounds like a simple platitude, but most of us fail this logic test. Think about why you are willing to spend the money and add the burden of this new technology to your daily regimen, and whether it will really make your life easier, more efficient or even more fun (emotional needs do qualify!). If you're anal enough, then take the time to write your needs down.

Search, Research, and Countersearch.

Technology overload is matched equally by information overload. Unless you are a persevering researcher with lots of time, you need filters to reduce the avalanche of available data down to an examination of the product features that really matter to you (refer back to item #1). The Internet provides a fire hose’s worth of information via a simple Google search, but I routinely go to CNET.com, one of the better consumer electronics reviewers, for well-organized, distilled information. Or better yet, I often use the Delphi method by consulting a couple of techno-savvy friends who can quickly give me the scoop on the latest and greatest products.

Compare Cost versus Benefit to Compute Value.

Assuming that you can compellingly define why you need this new technology, compare the cost and overhead of using and maintaining your new purchase. Will this purchase really improve your life or just add another burdensome feature? The key objective should be that it provides greater freedom. Figure 2 graphically depicts the combination of price point and product usefulness where value is achieved, which will vary for each of us.

3Pe

A crass plug for ChainLink's methodology, you say! No, you can really apply some of it even to your personal circumstances. This new technology may require behavioral changes on your part to realize the benefits, so consider whether you're up for it. Technology is not in itself a solution but only an enabler. How many techno devices are lying around your house unused or underutilized?

Take the Deal!

If you made the decision to buy the core functionality, then sometimes the more troublesome decision may be to determine what other features to take through bundled offerings or up-sells. Don't be a sucker, but there are good deals to be had, so take the ones that provide features that might be useful and are marginally priced.

Buy for the Future.

One thing's for sure: whatever you are buying will become obsolete in the not-too-distant future, if it isn't already. Only research can help you avoid buying at the obsolescence end of the curve and instead buy the right technology that will last a while. Beware the lurking technology transition!

On a more serious note, here are some ideas on how to survive the avalanche of technology, avoid the lure of its pitfalls, and hopefully succeed with it as the enabler to true process innovation.

3Pe. Long before you've reached the point of considering a technology acquisition, you should have performed the necessary policy and process analysis and exhausted all actions to climb the improvement curve before you need technology innovation to stimulate a new curve. The old adage still resonates today: "simplify, integrate, and then automate." Furthermore, only by designing new policies, processes, and people requirements can you assure that you will achieve the benefits of the new technology. Many of us have experience with software implementations that failed, never achieved their projected benefits, or created "islands of automation" in factory processes whose benefit is constrained by the unexamined limitations of the upstream or downstream process. In my experience at Dell, we initially worried that software providers or the competition would imitate our much-prized configure-to-order IT functionality, until we finally came to the realization that we could give it to them and it would make little difference, as most were not willing or able to make the fundamental changes in their business model and processes to successfully utilize it.

When the improvement curve flattens, can innovation spur a new one?

Document Requirements.

Amazingly, even in corporate environs, many still fail to perform due diligence with this important first step. Without completing the important analysis and determination of critical core requirements, technology acquisition decisions will be based on emotion, intuition, or some unspoken logic. Once you have decided that you need a technology enabler, distill the requirements list down as much as possible, noting critical needs versus "nice-to-haves". Get external consulting help if needed, as there are specialists in this field who can provide assistance in focusing and accelerating your technology search (e.g., TechnologyEvaluation.com).

Technology Partnerships are a Marriage.

Don't view this as a one-time purchase, but as a long-term relationship in which you will continue to depend on your partner for not only technical support, but also future development. Check out the potential partner's financial viability, install base, industry reputation, etc. to ensure that your new partner will be there when you need them.

From whence did all this technology come?

So how and why did we get ourselves into this techno dilemma, or should we even be worried? Why concern ourselves with a personal analysis of consumer technology purchases? Consumer transactions drive over 60 percent of GDP and thus heavily influence the development and adoption of new technologies, even by the business community. A study underway by Computer Sciences Corp. (CSC) titled "Consumer Technology Will Drive Corporate IT Agenda" (csc.com/features/2005/index) makes this point (e.g., text messaging). Maybe we can learn something from my not-so-unique consumer technology experience, whether we are consumers or providers/sellers. Regardless, these dynamics apply equally to either set of players. Here are some technology facts that I've personally deduced.

Technology gets cheaper and cheaper per unit of performance. By any measure, in whatever technology, performance per dollar increases exponentially as the technology matures, markets grow, and production costs decrease. I bought my first 20 MHz PC in 1991 for $1,500. Thanks to the successful fruition of Moore's Law, I can now buy a 2.8 GHz PC, with 140 times the processing power (2,800 MHz / 20 MHz) and many more accompanying features, for $500. At one third of the price, that works out to roughly 420 times the performance per dollar.

High tech companies are "pushers". I worked for high tech companies for many years, so do not think me ungrateful in my assessment here. These companies prosper by developing and pushing technology into the marketplace, thus attempting to find or create markets for their products. Sometimes, innovative technologies begin without an obvious market or use and serendipitously become vital (xerography). Almost always, the performance curve of the technology advances faster than the performance needs of the market, which creates a market dilemma for these companies—how to get the market to buy more.

Again, the PC industry demonstrates this dilemma aptly. In the '90s, not only did many people buy their first PC, but major technological events, such as the introduction of Windows or the advent of Internet usage, stimulated necessary hardware upgrades by the marketplace. Much like Stephen Jay Gould's concept of "punctuated equilibrium", in which significant geological events stimulated accelerated spurts of evolution, these technological events motivated leaps in development that the marketplace proved willing to absorb. However, as the frequency and impact of these once-significant PC technological events have decreased, reasons for buying a new machine have lessened. The latest consumer drivers, such as the connectivity between digital entertainment and PCs (photography, audio/video, etc.), have helped, but have done more to stimulate purchases of those other digital devices than of PCs. Thus, the industry growth rate has declined by 50 percent since the year 2000. Indeed, PC acceptance has plateaued at about 60 percent market penetration among consumers, compared to products like phones at over 90 percent and DVD players at 70 percent (after only seven years, when most consumer technologies have reached only about half that rate).

Rejoice or beware disruptive technologies! Read Clayton Christensen's book, The Innovator's Dilemma, to understand how new or initially lesser-performing technologies eventually overtake the performance needs of the marketplace and steal customers from higher-end technologies. If you are on the customer end of this phenomenon, then rejoice. If you are a technology provider, then remember Andrew Grove's apt adage that "only the paranoid survive".

You get more than you bargained for. Because of the sometimes growing disparity between performance offered and performance required, product features are bundled to entice you to buy other capabilities not on your core list of needs, and thus you get the cell phone that takes photographs, sends e-mail, and surfs the web when all you really wanted to do was make a phone call. Most technology sellers gain additional commissions, margin, and profits when they "up-sell" or bundle other product features beyond the standard offering.


Confessions of a Techno Junkie

The first step towards solving a problem rests with admitting that you have one. I am writing today to confess that I have a technology problem, although I feel some comfort in the knowledge that I must not be alone. I'm of above average intelligence, college educated, have held executive positions with high tech companies, am not necessarily an early adopter, and pride myself on making logical, rational decisions, even when it involves acquisition decisions. OK, so I read David Taber's article in the February Parallax, ("The Taber Report: Customer Behavior"), which suggests that personal purchase decisions are 80 percent emotional, but still I can aspire, right? Before you read on, understand that this is no anti-technology polemic, but rather a statement on our collective struggle to absorb the technology streaming at us.

Anyway, my problem is best described as technology overdose. Specifically, I currently own more electronics technology than I can personally absorb in five lifetimes, and yet technology and I are far from finished. In my corporate life, although I always tried to make fiscally responsible technology decisions that improved competitiveness and delivered shareholder value, I must admit to being part of more than one attempted technology transition that ended with less than expected results. Scientific studies suggest that Man only uses 8 percent of his effective brainpower, so perhaps there is some direct correlation between our inefficiencies in neurological and technological utilization. I estimate that I effectively use 20 percent of the technological capability that I own, so hey, I'm way ahead of the curve! In a desperate attempt to improve my technological competence, I've even taken out a subscription to Wired magazine to keep abreast of the latest technological trends. It remains to be seen whether this will really increase my utilization of currently owned technology or conversely just inspire me to buy more (digital radio, iPod, etc.).

Besides the simple fact that I own more technology than I apparently need, I have deduced several other themes in my technology ownership. First, the devices that I use most effectively fulfill core needs in my life and somehow deliver lifestyle freedom. Second, because they are increasingly useful or even vital to my daily regimen, I use them more frequently, which has concurrently motivated me to learn to take greater advantage of their technological offerings. Indeed, the life cycle of technology evolves for me from interesting to useful to vital to replacement or obsolescence. My techno devices all fall somewhere in the useful-to-vital category, since I do generally manage to avoid buying at the interesting stage. The less evident conclusion hides in those devices I own with low utilization rates (e.g., my cell phone), where I may have had a vital need in buying the device but received a lot more concomitant technology than I needed. My state-of-the-art cell phone is the best example of this. No mere phone, it lets me take pictures and instantly e-mail them, manage e-mail, peruse the Internet, send text messages, and work anywhere in the world, except, of course, in my home in Texas, which is officially located in a "no coverage zone"!


Figure 2: When do you buy technology?

Technology as a business antidote

Now that I've made light of my personal experiences as a technology consumer, let's leap to the arena of corporate technology acquisition and discuss any parallels. Without the benefit of scientific analysis, most of us would probably agree that businesses do an equally poor job of purchasing and assimilating technology into the workplace to achieve intended benefits. We all have war stories of failed software and hardware implementations, and significant technology investments gone awry. Early in my career, I once worked for a company where we sadly joked that we owned more software licenses for applications that we had not implemented than for applications that were implemented. During the heyday of the dot-com era, "supply chain software" seemed to take a parallel maniacal trajectory and proliferated at a high rate. The difficulty as a practitioner rested in ferreting out what the core functionality of any of these packages actually contained and how they might fit into an overall supply chain IT infrastructure. Talk about techno confusion!

Many reasons contribute to the struggles businesses have in assimilating new technology, including poor technology or partner selection, lack of structured 3Pe analysis and redesign, incompetent project management, etc. But more fundamental than any of these, I believe, is the delusion from which so many companies have suffered: that they can buy a technological solution to their business problems. Yes, corporate entities, just like consumers, can get "drunk" on technology in their quest for success and mistake the enabler for the total solution. Corporations, much like we techno junkies, could probably stand some version of a 12-step program to right themselves in the battle for techno sanity.


Microsoft’s Underlying Platform Parts for Enterprise Applications: Somewhat Explained – Part 3

What About Visualization and User Interface (UI) Technologies?

However, what has somewhat intrigued me is Microsoft’s not-so-vocal touting and promoting of Windows Presentation Foundation (WPF), although it is an intrinsic part of the .NET Framework. In fact, to the best of my knowledge, the tool has not yet been used within the Dynamics set in earnest, although Lawson Software and Verticent are the two independent software vendors (ISVs) that I am aware of that have deployed it.

Both vendors tout WPF’s rich UIs that support virtually infinite customizations and business process compositions using Microsoft applications. Other Microsoft-centric ISVs either support only a limited number of specific and prescriptive business scenarios, or use a combination of technology products (for example, Microsoft Office Business Applications (OBAs), Visual Studio.NET, and proprietary interfaces and UI tools) to come up with similar custom scenarios. Again, Microsoft currently uses WPF very selectively in Dynamics UIs, for example, in the Dynamics AX graphical view of the organization structure of the business.

With its Smart Office offering, Lawson is not the first to leverage Microsoft Office to deliver not only manager and employee self-service, but much more as well. In fact, I can think of the joint SAP and Microsoft Duet product, the Epicor Productivity Pyramid, the QAD .NET UI, SYSPRO Office Integration (SOI), IFS Business Analytics, and so on.

However, by leveraging WPF, Lawson embeds manager and employee self-service functionality more directly into Microsoft Outlook than Duet (which is more of an add-on launched from Outlook as an integrated pane) and most other vendors’ OBA solutions. For more details on Lawson Smart Office, see my earlier blog post on the vendor’s CUE 2008 conference and the Gartner Dataquest Insight report by Bob Anderson entitled “Lawson Raises the Bar With Differentiating ERP User Interface.”

Curiously, Lawson has deployed another non-mainstream Microsoft technology, Microsoft Office Groove. It is a peer-to-peer (P2P) collaboration platform, providing an outstanding base for collaboration (document exchange) scenarios that involve teams with sometimes disconnected participants. Microsoft claims that future product releases will improve the alignment for collaboration between Groove and SharePoint.

Lawson’s technology decision likely owes to Groove’s concept of “shared workspaces” and Lawson’s view that individuals live in a “space” where they do most of their work. For example, a manager really “lives in” Microsoft Outlook, and should be able to do all his/her work from there. An accountant lives in Microsoft Excel and should be able to work from there. A mobile technician lives in the cell phone/personal digital assistant (PDA) metaphor, where UI similarity to the Apple iPhone or Palm Treo can come in handy.

Some Other Vendors’ UI Approaches

Still, although WPF provides a visually appealing, familiar, and intuitive UI, it comes with some trade-offs, specifically heavy memory utilization (it is hardware intensive), the need to be connected to the network, and a much greater dependency on Microsoft software. For instance, IFS doesn’t use WPF today for the IFS Applications UI simply because of hardware needs: running WPF requires quite a hefty PC in terms of memory, and preferably the (possibly still unstable) Windows Vista platform.

We are talking here about IFS’ upcoming next-generation UI, which had for some time been called Aurora, but is now called IFS Enterprise Explorer (IEE). Namely, to prevent any confusion about Aurora being a separate product from IFS Applications, IFS has recently clarified its naming conventions.

Aurora is now a development project that will yield several enhancements to IFS Applications, all with a focus on ease-of-use and user productivity. The first deliverable as part of the Aurora project is IEE, the new graphical user interface (GUI) for IFS Applications. It is important to note that after IEE is released, the Aurora project will continue, yielding future enhancements.

In any case, IEE is interesting, to say the least, for leveraging Microsoft UI technology to create the look (albeit not yet the multi-touch screen, hand-gesture, etc. feel) of the Apple iPhone (on top of an Oracle database and Java-based application servers on the back end: some mix of technologies from adversaries, indeed). It is becoming quite obvious that the iPod and iPhone generation is our future workforce, who require well-designed tools that they “love” to interact with. At the same time, they accept no excuses for “Why can’t I…?” questions, such as, for instance, “Why can’t I search in the enterprise application the same way that I search on Google?”

At the end of the day, the design goal is to achieve more with fewer staff members, who thus have broader responsibilities, are able to handle the unexpected, collaborate with colleagues, and be more productive. In other words, the market drivers are the new and engaging design and user productivity. Consumer information technology (IT) and the web are leading the way, and are also becoming quite important for business applications.

To that end, prior to the IEE undertaking, IFS developed a pervasive enterprise search engine that attempts to think the way people think (e.g., “I need that fault report about the fire alarm not working”), and not the way enterprise systems think (i.e., “I want to go into the preventive maintenance module where, in the service request folder, I will start the fault report screen, in which I shall then make a query on the description field containing any words followed by the words ‘fire alarm’ followed by any other words again”).
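As a toy sketch of the difference between the two mindsets, the snippet below matches a free-text query against whole records instead of requiring a module, folder, and field; the records and field names are invented for illustration:

```python
records = [
    {"id": 1, "type": "fault report", "description": "fire alarm not working in hall B"},
    {"id": 2, "type": "purchase order", "description": "replacement fire extinguishers"},
]

def search(query: str) -> list:
    """Return every record whose combined text contains all the query's words."""
    words = query.lower().split()
    return [r for r in records
            if all(w in " ".join(str(v).lower() for v in r.values()) for w in words)]

print(search("fault report fire alarm"))  # finds record 1, with no module navigation
```

A production engine adds indexing, ranking, and security trimming, but the user-facing idea is the same: one box, plain words, results from anywhere in the system.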

Monday, October 5, 2009

Is There a Panacea for Enterprise Software Pricing Yet?

Joking aside, while enterprise software is apparently reaching commodity status, no one can determine with certainty what a fair price should be for a solution that fits the needs of most enterprises. If the price depends on the functional fit, then should a highly functional enterprise resource planning (ERP) or supply chain management (SCM) solution cost $1,200 per user? Should it be more? Less? Working on the assumption that the product fits the prospective customer's needs "like a glove"—and the importance of this fit cannot be overemphasized—what would be a reasonable price? Note that we're not necessarily talking about a "fair" price. After all, in the free market, pricing is based on supply and demand, the customer's state of urgency, and the perceived value of the product.

The real problem is how to compare one vendor's pricing against others' in a like-for-like manner. After all, total cost of ownership (TCO) calculations should not be rocket science. One has to start by determining application software license fees (which occasionally include third-party software license fees). To this, add professional services costs (which typically include training, implementation services, and so on), hardware costs, and annual support and maintenance costs (which may also include support for third-party software and hardware, such as bar code readers in the case of radio frequency [RF] applications).
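As a back-of-the-envelope sketch of that arithmetic, consider the following; every figure is a hypothetical placeholder, not a pricing benchmark:

```python
def total_cost_of_ownership(license_fee: float, services: float, hardware: float,
                            maintenance_rate: float, years: int) -> float:
    """License, services, and hardware up front, plus annual support and
    maintenance charged as a percentage of the license fee."""
    return license_fee + services + hardware + license_fee * maintenance_rate * years

# e.g., $300k in licenses, 1.5x that in services, $50k hardware, 20%/year, 5 years
print(total_cost_of_ownership(300_000, 450_000, 50_000, 0.20, 5))  # 1100000.0
```

The formula is trivial; the difficulty, as the rest of this section argues, lies in getting comparable inputs out of incompatible vendor pricing models.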

Certainly, there might always be some optional software or service costs, but basically, TCO revolves around total software- and service-based costs. So what is the problem? Well, let us first tackle comparing common license fee pricing methods. When we are involved in the software selection process, we normally help clients come down to a few finalist solutions based on functional and technological fit, whereupon they then ask us to help them negotiate the best contract price, and finally select the winner. Yikes! Deciphering the US Tax Code is a simple task compared to that!

Plenty of vendors, of course, may produce nominal price lists for their software modules, but if chief information officers (CIO) at prospective user enterprises (or professional negotiators acting on their behalf) want to know the actual pricing benchmarks that similar customers have paid, they either must have insider knowledge, or a hotline to some of the vendor's reference enterprise customers. And of course, they can only gain access to this information when such details are not protected by confidentiality or non-disclosure agreement (NDA) clauses. Even then, these figures are typically muddled by an absence of like-for-like pricing models.

One vendor charges license fees per user (named, concurrent, or casual) and per module (or per bundle of functional modules in a suite). Another bases prices on server/central processing unit (CPU) size. And still another bases prices on its perception of how large (read "wealthy") the customer is, the likelihood of implementing a total functional footprint, and the total number of users the customer might ever want to have. What can one discern from all this? Such a scenario is truly comparing "apples with oranges," and then dumping berries into the mix. However, some vendors will rightfully say that they cannot simplify pricing because different customers want different pricing systems, and they have to do what customers want.

For example, many customers resent the notion of paying for functionality they are not likely to use. In some instances, even if a large corporation will need expanded functionality down the road, it will pay only for the modules it needs at the time of selection. Lengthy implementations of (for example) financial management and consolidation, or human capital management (HCM) systems across several divisions and several hundreds (or even thousands) of users worldwide, may make an enterprise reluctant to embark on another lengthy implementation adventure right away.

Thus, a vast majority of business application software vendors still generate most of their revenues by selling their software licenses based on the number of named or concurrent users or seats, typically on a per-module or per-suite basis. They may also generate revenue based on an excessive "wall-to-wall," "all you can eat" functional scope. The accompanying implementation, post-implementation, and support and maintenance services also add to the "pot." These are all priced as a percentage (and more often multiples) of the software license fees.

However, from the user perspective, there is a fitting analogy: in one trip to the store, no one will ever buy a lifetime's worth of snacks and coffee. We are selective, and over the course of a lifetime, we will buy different products based on what we feel we need or want. Software applications should be treated the same way. To return to our analogy: the stockpile of snacks will ultimately go stale, and the coffee will lose its flavor, or go rancid. Little is consumed, except hard-earned money (see Application Erosion: Eating Away at Your Hard Earned Value). The same happens with software that is not used.

However, purchasing software à la carte is easier said than done. The trouble comes from the "fine print" addendums in software purchase contracts, which are often longer than the main contract itself. These clauses are typically designed to protect the vendor from any future liabilities or to "nail" the gullible customer with extra costs. The fine print will customarily include a statement to the effect that the contract includes only the stipulated basic functionality. Any additional modules will use a different pricing structure, whereby the client will likely pay far more at the end of the day than if the whole application license had been purchased up front. Smaller enterprises (given their smaller number of users and less complicated implementations) typically want most of the available functional footprint implemented at once, with the option to expand to additional functionality later. These customers might appreciate the option of "wall-to-wall" functionality up front.

Another pricing variation is based on what users actually use, not on what they could get. This is the so-called "pay as you go" option. Typically, enterprise software comes with parametric "switches" that can be set based on need. For example, a small-to-medium enterprise (SME) user might set a switch to use multicurrency, consolidate financials, use warehousing management, or use standard or actual costing. Based on a contractual agreement, these switches can be set at the "software factory" to limit the functionality that is available to the user. If an enterprise decides later to activate a function, such as warehousing management, the enterprise would have to pay. While feasible, this requires serious contract management and tracking, and an argument could be made that SMEs that use less functionality require less support.
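Here is a sketch of the parametric "switch" idea; the feature names and entitlement table are invented, and a real system would tie these flags to the contract and audit changes to them:

```python
# Contract-controlled switches: the code ships, but features are gated.
ENTITLEMENTS = {"multicurrency": True, "warehousing": False, "consolidation": False}

def require(feature: str) -> None:
    """Refuse to run functionality the customer has not paid to activate."""
    if not ENTITLEMENTS.get(feature, False):
        raise PermissionError(f"'{feature}' is not activated under the current contract")

def post_warehouse_receipt(order_id: str) -> None:
    require("warehousing")  # the pay-as-you-go gate
    print(f"receipt posted for order {order_id}")

try:
    post_warehouse_receipt("PO-1001")
except PermissionError as err:
    print(err)  # 'warehousing' is not activated under the current contract
```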

This brings us to the next fine-print item, possibly the most controversial point: service and support calls. With some large software customers reportedly spending obscene amounts of money just to patch their existing software, the cost of poor software quality is becoming painfully apparent to software customers. Vendors typically provide a short period of time—from ninety days to a year—as a "warranty period," within which software fixes are available for free. After that initial period, software customers must pay for service and maintenance, on an annual subscription basis or on a per-incident basis. As a result, many customers feel taken advantage of, and those who select the solution are presumed "guilty," paying for the vendor's ineptitude at delivering quality software in the first place.

From this vantage point, many wonder whether the urge to hastily implement these systems prior to the Y2K deadline was so great that it blinded common business sense. Consequently, users have ended up on the losing end of their contractual rights, and have become the victims of exorbitantly high costs for enterprise software use, upgrades, and administration.

However, besides user frustration, vendors are facing the pressures of cutthroat competition, which is pushing prices downwards. Users also have strong expectations that improvements in software quality should allow vendors to significantly extend the "free" period to something closer to the expected life of the software.

Software customers have a legitimate right to expect their vendors to stand behind their products for a few years or more. During this time, vendors should back product capabilities based on the user's requirements (these capabilities are all too often easily "promised" by aggressive sales personnel). This backing would also make certain that the cost of the software product would remain fixed (and known) during that time. Through this approach, software vendors would have more difficulty in passing the expense of poor quality software to their customers through arbitrary rate increases for software maintenance. Rather, they would have to focus on producing better software to reduce their support costs, in order to remain viable players in the market.

Business-related issues related to software use inevitably arise daily. These issues may result from flaws or bugs in the software, a misunderstanding about how the software works, or both. In any case, software customers want to be able to contact a qualified expert to discuss the issue and resolve it in a timely fashion, with minimal impact on their business. The customer does not really care whether the issue arises from a bug, user error, or lack of knowledge.

Software vendors need to provide unlimited access to support services, including experts trained in the detection and fixing of design flaws in the software, as well as having the ability to explain (in lay terms) how to get the software to perform the functions required by the customer. To that end, no one wants to plow through the fine print to understand how many free-of-charge calls are permitted. Instead, it would be more reasonable to allow for unlimited calls during a multiyear warranty period.

Business processes and practices do undergo constant change (see What's Wrong With Application Software? Business Changes, Software Must Change with the Business). For example, changes in business strategy (such as shifts to lean manufacturing) can mandate changes to business processes. Regulatory changes, such as those stemming from the US Food & Drug Administration (FDA), the Health Insurance Portability and Accountability Act (HIPAA), or the Sarbanes-Oxley Act (SOX), can also mandate changes to business processes. Software vendors have an obligation to keep their products from becoming obsolete even in these contexts, and customers have the right to expect that they can use updated products to address these issues, including reasonably easy and cost-effective upgrade processes.

One can only imagine the public outcry and revolt if car manufacturers decided to charge car drivers an annual fee—a percentage of their nominal car model price, say—indefinitely, for future developments! At this stage, software vendors are participating in a practice similar to that of pharmaceutical companies. Pharmaceutical companies justify the exorbitant prices of critical medications for consumers by professing the need to reinvest part of their hefty profits for future developments, but even in this case, customers know what they have to pay when they are at the counter. Maybe if enterprise software development could come up with a cure for cancer, AIDS, or Alzheimer's, or implement some other supreme benefit to humanity, then software users would also put up with the recurring service and maintenance costs.

Web-enabled Sales Tactics

From Inquiry Qualification to Suspect Development

Marketing's demand generation strategies have been significantly realigned over the last few years to encourage the use of the Web for registration and fulfillment activities. Enough information has been published about business-to-business (B2B) marketing of technology products that I'll simply recommend a Google search or exploring www.technologyevaluation.com for more information on how to generate more web inquiries. It's marketing's job to attract first-time visitors to your web site; this is the Problem Awareness stage in the buy cycle methodology that we summarized in Part One. Unfortunately, many salespeople criticize their marketing department for generating so many "junk Internet" leads. Why? Because the sales lead qualification process has been so unproductive that it's actually counterproductive. We all know there is gold in those Internet registrations. Panning for that gold, however, requires a different qualification process with a new set of metrics, so the message to marketing should be: turn up the volume of those Internet registrations!

During the Understanding stage, as described in Part One, as the buyer comes to understand the problem and contemplates taking action to solve it, we begin qualifying buyers by creating a filtering process to separate the serious evaluators from casual web traffic. Getting visitors to return for second and subsequent visits is a collaborative effort between marketing and sales to create next-buying-stage transition offers that appeal only to the needs of the serious buyer. The objective is to create a series of follow-up offers that engage the self-directed buyer to further explore your solution, to reveal their interests, and to influence their buying process. Offer content is king in the on-line world, and sending a free automated e-mail will achieve this much more effectively than the old qualifying phone call, which delivers no immediate value to the buyer. For example, a next-day follow-up offer to a white paper download would read something like, "Thank you for your recent inquiry. Please call Jane Doe, your sales executive, if you would like further assistance evaluating our solutions. You may also be interested in reviewing our Express RFI Service, which outlines our solution capabilities and can easily be used to accelerate your project's needs analysis activity."
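
To make this concrete, here is a minimal sketch, in Python, of what such a next-day automated follow-up could look like. Everything in it is hypothetical (the event fields, the addresses, the send_follow_ups routine); a real implementation would sit inside whatever e-mail or marketing automation platform you already use.

import datetime
import smtplib
from email.message import EmailMessage

# Hypothetical transition-offer text, modeled on the example above
FOLLOW_UP_TEMPLATE = (
    "Thank you for your recent inquiry. Please call {rep}, your sales "
    "executive, if you would like further assistance evaluating our "
    "solutions. You may also be interested in reviewing our {offer}, "
    "which outlines our solution capabilities and can easily be used "
    "to accelerate your project's needs analysis activity."
)

def send_follow_ups(download_events, smtp_host="localhost"):
    """Send the next-stage transition offer to yesterday's downloaders."""
    yesterday = datetime.date.today() - datetime.timedelta(days=1)
    with smtplib.SMTP(smtp_host) as smtp:
        for event in download_events:  # each event: {"email": ..., "date": date object}
            if event["date"] != yesterday:
                continue  # follow up one day after the download, no later
            msg = EmailMessage()
            msg["From"] = "sales@example.com"
            msg["To"] = event["email"]
            msg["Subject"] = "Following up on your white paper download"
            msg.set_content(FOLLOW_UP_TEMPLATE.format(
                rep="Jane Doe", offer="Express RFI Service"))
            smtp.send_message(msg)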

Smart web sites know how to identify these return visitors and learn their interests from prior activity, so that they can deliver a more relevant and engaging visitor experience. Amazon.com is a good example of a site that tracks user history to create a profile that enables it to personalize its service. The goal at this phase is to identify early stage sales opportunities, such as organizations with unusually high volumes of inquiry activity. By integrating this information with the CRM system, salespeople have access to a list of high-probability suspects to research further off-line. Low-activity visitors are classified as dormant opportunities and can be added to a quarterly awareness e-mail campaign. The rules of engagement at this stage follow these general guidelines:

1. Respond to an inquiry via the same communication channel through which the inquiry was made.
2. Make it easy for visitors to transition from an impersonal on-line session to a personal dialog with tools such as instant messaging and click-to-chat, or to move to an off-line discussion with a clearly displayed telephone number.
3. Delivery speed is paramount with any on-line offer. Immediate is best; the day after tomorrow is unacceptable.
4. Only request information that is in the buyer's self-interest to supply.
5. Follow up every accepted on-line offer with a related follow-up offer.


A well-designed web site provides an ideal medium to collect buyer data. To transform buyer data into actionable information, it must be integrated into a customer relationship management (CRM) system to provide salespeople with a complete view of their prospects. The design of an effective Web-enabled sales system is akin to a multiplayer computer game where buyers are the players and salespeople are the coaches. As with any game, the content changes as buyers progress through each stage. The "buying game" tracks who downloaded which offers, what information they entered, and which web pages players reviewed. By consolidating all this information by organization in a CRM system, salespeople can identify the companies that are in play.
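
As a rough illustration of that consolidation step, here is a short Python sketch that groups offer activity by organization and flags the companies that are in play. The event fields and the threshold are assumptions made for the example; in practice this logic would live in, or feed, the CRM system.

from collections import defaultdict

IN_PLAY_THRESHOLD = 3  # assumed: distinct offers accepted before a company is "in play"

def companies_in_play(events):
    """Group accepted offers by organization and flag the active accounts."""
    offers_by_org = defaultdict(set)
    for event in events:
        offers_by_org[event["company"]].add(event["offer"])
    return sorted(org for org, offers in offers_by_org.items()
                  if len(offers) >= IN_PLAY_THRESHOLD)

# One record per download, form fill, or tracked page view (invented data)
events = [
    {"company": "Acme Corp", "offer": "white paper"},
    {"company": "Acme Corp", "offer": "ROI calculator"},
    {"company": "Acme Corp", "offer": "Express RFI Service"},
    {"company": "Widget Inc", "offer": "white paper"},
]
print(companies_in_play(events))  # ['Acme Corp']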

A few key points will help explain how the buying game concept applies to creating a winning influence strategy for the sales department. The first guiding principle is "give to get": give players (buyers) what they want to encourage them to continue down your buying path, and in return, get the information needed to maximize the buyers' value. The second principle is to create a learning game that gets smarter as it collects information. Welcome buyers back and remember what they accomplished on their last visit. As with every game, there is a scoring system that coaches (salespeople) can use to easily monitor and help improve the performance of their players.

There are a few design issues to keep in mind for the web site:

* There are no rules.
* Players can join the game at any point.
* Players can then proceed in any direction, and can skip around as they like.
* Not all players are buyers.

Finally, it's important to remember that your sales web site must fit into the overall buying game that a project team will play as they navigate the on-line world. Companies that deliver the most influential value at each buyer engagement are in the best position to eventually win the deal. The Research phase, as described in Part One, is when about 80 percent of the vision is solidified into project documentation. Since most project teams are chartered to implement new business processes, they find themselves learning as they go. They have two choices: they can either hire a consultant to show them the way, or they can search the Internet for materials that will guide them down a proven path. Buyers are attracted to six categories of on-line project accelerator materials at this stage:

1. How-to-buy roadmaps
2. Third-party product reviews
3. Opportunity value calculators
4. Selection criteria checklists
5. Needs assessment frameworks
6. Product overview materials

Sales and marketing organizations that don't offer these materials are missing a golden opportunity to transfer their value proposition into the project team's working documents. These project enablers can significantly influence the rest of the purchasing process. During the Research stage, qualified buyers are transitioned from the public web site to a more in-depth, members-only site, and then to an exclusive evaluation portal once a salesperson is actively engaged with the account. Using Web portal technologies, the sales department can create a secure environment that provides buyers with convenient access to privileged information and tools. A portal also enables salespeople to control and monitor the buyer's on-line experience more closely as the sales opportunity develops over time. As the buying organization progresses through each subsequent stage, the salesperson continues to configure the portal with additional value-enabling and information-gathering materials.

When is the best time to engage with a buyer? It's the magic moment when the buyer is visiting your web site. That is when buyers are giving you access to their needs and their full attention. Sales, your challenge is to seize these golden opportunities to probe for information, to deliver value, and to influence the buying process. Granted, your salesperson will not physically attend this virtual sales engagement, but a sales meeting did happen, and your buyer was influenced by the experience. Do your salespeople know what sales meetings their buyers have completed on your web site? Do they have access to the information that was collected? Nothing is more expensive than a missed opportunity.

When you begin your next sales negotiation, think about the valuable buyer information you never bothered to collect from your own web site while the buyer was collecting information on you. Both the marketing department and the customer support department have implemented efficient and effective web strategies which have improved service levels and reduced costs. Now is the time for the sales department to fill the void and regain their influence on the customer's decision process at each stage of the new on-demand buy cycle.

The Web-Enabled Sales Process

Today, nearly every business to business (B2B) information technology company I talk to is mad that its attempts to increase new account sales have failed. This has grown into a huge problem—to the point where a significant number of companies have decided that they are not going to take it anymore, and have totally abandoned new account growth strategies. However, by leveraging new technology, understanding the buy cycle value chain, and enabling today's self-directed buyer, sales organizations can significantly increase revenue and reduce costs.

Since the tech bust that followed Y2K, technology companies have become more and more frustrated by their attempts to win new business. Sales departments across the industry have tried all the traditional sales strategies: improving the quality of the sales force by replacing non-performers with proven professionals; improving staff knowledge by conducting sales training programs; and reorganizing into specialized industry verticals. They have expanded market coverage through reseller programs; created dedicated telemarketing teams to generate more leads; and implemented customer relationship management (CRM) systems to improve relationships with prospects and customers. Yet with each initiative, the cost of sales has escalated, and with each quarter end, new account revenue results have been more and more disappointing.

Consequently, sales management teams have been under intense pressure to keep their attention focused on the final act of closing the deal. The problem is that the sales department has spent too much time repairing the symptoms of the sales problem and has avoided dealing with its root cause: regaining the ability to influence the purchasing decision process. To effectively influence the purchasing process, sales must find new ways to identify buyers earlier; to collect buyer information; to gain buyer access; and to provide added value. Yet, this task is particularly daunting for business to business (B2B) enterprise system providers where decision processes span months. Sales tactics that worked well in the past to identify, access, and influence decision makers are no longer effective.

In the past, salespeople controlled the sales cycle by managing the flow of information. Today the information available on the Internet has empowered buyers to structure their own buying cycle. This shift is comparable to the days when the automobile engine replaced the horse as our primary source of transportation power. The horse couldn't compete with the speed, convenience and low cost of the automobile. The same can be said for the Internet, which has given buyers shopping tools and conveniences that didn't exist a few years ago. The time has come to completely reengineer the go-to-market strategy to make it compatible with the buyer's preferred mode of communication. Buyers will always be attracted to the least risky, most convenient, and lowest cost information outlet. The Internet fits these criteria and has become the primary channel that buyers use to complete many of their early stage buying tasks, and it continues to be an important channel of influence throughout the entire process. As a result, yesterday's successful consultative salesperson is being excluded from much of today's buying process.

I'm not suggesting that the Internet will make the salesperson obsolete. Personal selling will always play a vital role in managing the overall enterprise relationship whether it is conducted via the mail, on the phone, in person or across the Internet. What I am suggesting is that sales department personnel should think twice before they dial the phone or pack their bags to visit a client, and instead should consider clicking a mouse to deliver more effective support to prospects evaluating solutions on-line.

The prominence of the Internet has grown exponentially in a few short years. According to the PEW Internet & American Life Project, between 1999 and 2000 the Web became the "new normal" way of life. Back then, few of us realized how easy it would be to shop on-line. Now we can simply log on, search a few ideas, review product features, compare prices, select a vendor, and have a product arrive at our door the next day. As we enter 2006, over 70 percent of us enjoy a rich media experience from our homes, which is driving an on-line shopping growth rate of over 30 percent a year. Life in the on-demand world, as characterized by the iPod, allows us to tune in to our interests and tune out everything else. We have all learned to screen phone calls, to skip commercials, and to block spam so we can tune in to exactly what we want, when we want it.

Each morning when we arrive at work we bring these newly acquired habits and expectations with us. Is buying big, complex enterprise level systems really that different from personal shopping? It can be compared to the process of buying a major capital item, such as a house or a car. There is an old auto industry adage that the busiest day of the week on a car dealer's lot was Sunday, the day the dealership was closed. Has the Internet become the modern equivalent of visiting the dealer's lot on Sunday? According to a ZDNet Research statistic "in 2003, 94% of US consumers shopping for a car went on-line to do research, get quotes from dealers and to order brochures. This compared to 67% who actually visited a dealership when making a decision on which car to buy." While eventually the buyer will go to the car lot to test drive and buy the car, the preliminary research to create a shortlist is being done on-line.

It's human nature to avoid unsolicited sales contact. People are very uncomfortable with the emotional aspect of the buyer-salesperson relationship. It is not high-pressure sales tactics that are the source of this anxiety. The problem is that a person's sense of obligation grows as a personal relationship develops, and so too does the pending dread that all but one of these relationships will have to be broken. As the song goes, "breaking up is hard to do," and we know that salespeople don't accept no easily. In the past, buyers sacrificed service and drove to the dealer's lot on Sunday to avoid these awkward situations. Today, buyers are avoiding the fear of relationships by simply going on-line, and are getting access to better information than salespeople ever provided. In nearly a third of car-buying situations, the buyer's decision is already made before he or she arrives at the dealership. The same applies to decision makers who are seeking enterprise-level solutions. As a result, many salespeople will never get the opportunity to position their solution, and a lucky few will not get their opportunity until much later in the buying process.

So who's qualifying who these days? When salespeople get a lead, they instinctively make the qualifying phone call to determine the prospect's pain, power, vision, value, and control. Qualification is the first step in the old sales cycle because a time-consuming and expensive discovery or needs assessment engagement is assumed to be next. However, today's buyers don't want a vendor's assistance at this early stage. With the help of on-line information sources, buyers would rather research and complete their own unbiased needs assessment study. The irony of the old qualifying call is that by the time a salesperson qualifies an opportunity, he or she will be too late to have a significant influence on the purchasing process. According to a 2004 lead qualification study by KnowledgeStorm, traditional qualification parameters are missing a significant market opportunity. The study estimated that 40 percent of early stage buyers were disqualified by sales because they hadn't yet determined the answers to the qualifying questions, and that another 40 percent refused to answer these questions simply to avoid sales contact. As a result, salespeople are missing the opportunity to influence 80 percent of today's buyers during the most impressionable stage of a project. Would salespeople dare consider the possibility that they can do more selling without being there? The eureka moment struck me when I realized that the more a buyer can do without personal sales assistance, the better.

Many marketing and sales departments think in terms of the "end game" of their solution's value proposition. All too often they forget that the winner is always the team that scores the most points at each play of the game. A selling approach designed around the buyer's information consumption process keeps salespeople focused on earning value points throughout the buy cycle. For example, a buyer's end-game problem may be solved by your supply chain optimization product, but right now, the buyer just needs to schedule a realistic project plan. If your competition has a better plan to offer than you do, then it just scored an influence point. As the saying goes, "it takes a lot more than a better mousetrap to win a deal." In other words, salespeople need to follow the buy cycle and fulfill the buyer's needs at each consumption point along the way.

The decision process for an enterprise-level system is defined by the corporate project life cycle within which the purchase falls. As with any business initiative, these projects can germinate from a variety of sources, but once a project is sponsored as an official initiative, it follows a relatively predictable decision process. To stay focused on the customer's buy cycle, we use the PURCHASE acronym to designate eight separate purchase decision-making stages. The following is a brief description of each stage, along with a few appropriate value offers that sales can provide:

1. Problem. In a pre-contemplation mode, individuals search the Web to gain an awareness of the latest problem-solving innovations, industry issues, and business trends. These education seekers are willing to register an e-mail address to gain access to interesting on-line information. Marketing departments are currently doing a good job of providing business issue white papers, customer case studies, and product brochures. However, sales qualification resources are being wasted on the inquiry registrations generated from this segment. Automated follow-up offers should be sent to these inquiries to determine their interest level, with an option to subscribe to a newsletter or register for preferred access to additional information.

2. Understanding. In this contemplation mode, a group of individuals unite within an organization to understand a specific problem in an effort to propose a possible solution strategy. They continue to search and gather the information necessary to build the business case required to establish an official corporate initiative with executive sponsorship. The sales strategy for this stage is similar to the problem stage, with an additional element: data mining analyzes buyer web site activity by organization to identify accounts with increased activity levels, which sales can research and possibly target off-line as high-probability suspects (see the sketch after this list).

3. Research. In a preparation mode, a project team works to formalize a project structure to deliver a solution to the organization. The group's psychology immediately transitions to a more pragmatic, early-adopter mindset. The focus shifts from understanding the problem to creating the vision and charting a path to a solution. Since this is new ground for the organization, the team searches the Internet for project enablers such as evaluation roadmaps, third-party reviews, budget calculators, needs assessment templates, and project plans. While today's self-directed buyers may be keeping the salesperson physically out of the process, they are happy to use their project-enabling resource downloads. Smart sales organizations are transferring their value propositions into the working documents of project teams in the form of needs assessment spreadsheets, return on investment (ROI) calculators, and other project templates. High-quality project-enabling materials can provide a valid business opportunity to engage earlier than the competition and begin building a trusted personal relationship.

4. Comparison. The project team transitions into the evaluation phase with a clear vision and a shortlist of qualified vendor organizations. Salespeople are engaged to visit for the first time, continuing to sell where their on-line sales collateral ended. At this point, the buying team knows exactly what it wants to see to complete its final evaluation. The salesperson should introduce the concept of a "non-disclosure level" evaluation portal at this stage. Salespeople should empower the project team with access to a standard array of high-quality e-collateral portal content (presentations, demonstrations, and testimonials) designed to address the standard evaluation issues, so they can focus on solving the prospect's higher-value business problems. In a collaborative environment, an empowered project team can become an inside sales force motivated to win the organization's buy-in for its project. By monitoring portal activity, sales can evaluate its competitive position based on each contact's individual activity level.

5. Homework. Preparation for authorization is a very active internal stage, when key project team members work to justify a recommended action plan and preferred solution. They prepare the detailed capital authorization documents and begin planning the implementation. Often the salesperson is told he or she is one of two finalists, just to keep them honest through negotiation. But truth be told, there is a third alternative: a "no-decision." A delay or no-decision is the typical outcome when the project team submits a weak business case to management. By offering expert help with the project enablers transferred in the Research stage, the sales team can earn the opportunity to collaborate on the internal business case.

6. Authorization. This is an internal sales activity in which the project team has to sell its business case to a very conservative, risk-averse executive group that is emotionally disconnected from the project. Given the amount of senior management scrutiny, project team members are highly motivated to win approval for their project. While this phase may drag on longer than expected, sales organizations have three primary objectives: to monitor their competitive position, to maintain team member enthusiasm, and to defend against competitive attacks. By linking the business case to portal-based e-collateral, sales can monitor approval activity levels. By offering pre-implementation e-learning materials, sales can help an enthusiastic project team get a head start on the next phase of the project, which also leaves members little time to listen to competitive attacks.

7. Signing. This stage begins as the buyer prepares to negotiate the deal, and continues until the first payment is received. Pre-negotiation posturing has been going on for a while as buyers focus on mitigating risk issues and threaten sellers with the other viable alternative. Buyer information is invaluable at this stage. The project team members are instructed to be very vague as the buying negotiator "holds his cards very close to his chest." By maintaining engaging installation and pre-implementation content in the evaluation portal, sales can monitor buyer usage activity to determine the buyer's level of commitment. Nice words from the buyer that are not accompanied by corresponding activity are an early indication of a serious sales problem, while tough talk and a high activity level are indicators of a strong position.

8. Expansion. Once the solution is successfully implemented, the organization looks to leverage the solution's success across other areas of the business. At this point, the new customer is transferred to a customer support portal, which would include an evaluation capability for additional products and services.
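
As promised in the Understanding stage above, here is a hedged sketch of the kind of data mining that can surface suspect accounts. It simply compares an organization's most recent week of web activity against its own baseline; the weekly buckets and the 2x spike factor are illustrative assumptions, not fixed rules.

def suspect_accounts(weekly_visits, spike_factor=2.0):
    """weekly_visits maps organization -> weekly visit counts, oldest first.
    Returns the organizations whose latest week jumped well above baseline."""
    suspects = []
    for org, counts in weekly_visits.items():
        if len(counts) < 2:
            continue  # not enough history to establish a baseline
        baseline = sum(counts[:-1]) / (len(counts) - 1)
        if baseline and counts[-1] >= spike_factor * baseline:
            suspects.append(org)
    return suspects

# Invented activity data: Acme's latest week is well above its baseline
activity = {"Acme Corp": [3, 2, 4, 11], "Widget Inc": [5, 6, 5, 4]}
print(suspect_accounts(activity))  # ['Acme Corp']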

Understanding the different stages of the buy cycle and finding the appropriate value offers is only half the job. The next challenge is getting access to the right people and collecting the right information to deliver the best value. Let's think about this for a minute. The people you want to access are those visiting your web site. They are right there registering for exactly what they want. The golden opportunity lies with the visitor on your web site: you have the access, they have the need, and they are willing to provide information, if you can deliver immediate value.

Who Could Object to Faster, More Responsive Supply Chains?

If anything is certain in today’s global supply chains, it is the constant change and volatility that do not let anyone relax—not even for a moment. This unsettling pattern is the result of the globalization trend and the related evolution of supply chains.

Enterprises can choose one of two types of integration in the supply chain management (SCM) constellation: vertical or lateral (horizontal) integration. APICS Dictionary (11th edition) defines vertical integration as

the degree to which a firm has decided to directly produce multiple value-adding stages from raw material to the sale of the end product to the ultimate consumer. The more steps in the sequence, the greater the vertical integration, and a manufacturer that decides to begin producing parts, components, and materials that it normally purchases is said to be backward integrated. Likewise, a manufacturer that decides to take over distribution and perhaps sale to the ultimate consumer is said to be forward integrated.

In other words, vertical integration, or vertical SCM, refers to the practice of bringing the supply chain inside the four walls of one organization. Traditional vertical integration, or the ownership of most (if not all) parts of a supply chain, is the method of SCM that long preceded the relatively recently coined term "supply chain." By bringing most of the supply chain activities in house and putting them under corporate management, vertical integration has basically solved the problem of who should design, plan, execute, monitor, and control supply chain activities.

One often-cited example of vertical integration, as described in the APICS Certified Supply Chain Professional (CSCP) Learning System, Module One—Supply Chain Fundamentals (2007), is the automobile company built by Henry Ford, which often receives credit for being especially successful with this approach. In the early days of the automotive industry, Henry Ford pursued a strategy of owning and controlling as many links in the automobile supply chain as possible, from rubber plantations supplying raw material for tires, right on through to dealerships that distributed finished cars to the public. In an attempt to create a self-sufficient enterprise, the automotive giant also owned iron ore mines, steel mills, and a fleet of ships, as well as the manufacturing plants and showrooms that built and distributed the cars bearing Ford’s name.

The primary benefit of vertical integration is control, since a department or wholly owned subsidiary with no independent presence in the marketplace cannot, for example, deal with competitors to sell its components or services at a higher price. Its operations should, theoretically, be completely visible to the parent company, as well as be synchronized with other company functions by directives from the top. The corporation’s schedules, workforce policies, locations, and amounts produced (i.e., all aspects of its business) are controlled by the overarching management.

Vertical integration may still exist nowadays as a viable way of managing a supply chain. Wireless phone companies are an example of the type of business that operates this way. They purchase the phones, stock them at retail outlets, sell them, provide coverage, and handle warranty service.

Vertical integration generally went out of vogue as corporations expanded and as global supply chains became over-extended. Indeed, lately it has become quite difficult for any complex corporation to bring together the expertise needed to excel in all elements and countless activities of the supply chain. Therefore, most modern corporations have turned to outsourcing those aspects of their business in which they believe themselves to be least effective. Even Ford Motor Company, the pioneer of vertical integration, has been no exception to this trend. A couple of years ago, the company’s management publicly acknowledged that “the days of being 100 percent self-sufficient and capable in today's world of high technology and engineering are gone.”

Rather than bring all supply chain functions in house, large manufacturers and service providers are now more likely to adopt a horizontal, or lateral, supply chain strategy, whereby separately owned entities focus on their individual core competencies and deal with each other through discrete transactions or longer-term contracts. The complexity and expense of managing all the activities in a global supply chain often drives top management to sell off assets that do not directly contribute to the core business. Ford divested itself of much of its in-house component production, as did DaimlerChrysler in shedding its Mopar division, and General Motors (GM) in letting go of its component supplier division.

Lateral arrangement has thus replaced vertical integration as the preferred approach to managing the many diverse activities in the supply chain. Once corporate ownership abandons the idea of vertical integration and turns to outsourcing various activities, it loses control of those aspects of the supply chain, and it has to deal with separately owned companies as suppliers or customers. Nevertheless, this has been the dominant trend in the evolution of SCM in recent decades in the Western world.

There are some compelling reasons for relying on a lateral supply chain, starting with the ability to achieve economies of scale and scope. Namely, regardless of how large and resourceful a corporation is, its internal supply chain functions lack economies of scale when compared with the potential capacity of an independent provider of the same product or service. Another reason is the ability to improve business focus and expertise, since vertical integration in a globally competitive market brings about the complexity of managing disparate business units spread across international borders, time zones, continents, and oceans. Conversely, an independent partner company that focuses entirely on its particular business can develop more expertise than an in-house department can, leading to more attractive pricing, higher quality, and quicker time to market.

Additionally, with the advent of the Internet and advanced communication technology, many of the traditional barriers to doing business at a distance and in a distributed manner have been eliminated. Near-instantaneous communication means that information can be shared collaboratively around the globe through, for example, videoconferencing, instant messaging (IM), or voice over Internet protocol (VOIP). Thus, as the world becomes one single, huge marketplace, it makes sense to deal with established companies that intimately know their local markets. Horizontal supply chains are also the logical extension of outsourcing, as they are closely related to the “virtual corporations” trend.

In the virtual corporation, the firm’s capabilities and systems are merged with those of its suppliers. This results in a new type of corporation, one where the boundaries between the systems of the master firm and those of the suppliers disappear. Virtual manufacturing is the changed transformation process usually found in the virtual corporation. Because the firm’s and the suppliers’ systems are merged, the components provided by the suppliers fall outside the firm’s core competency, while the components the firm manages itself remain tied to its core competencies. One of the many benefits of the virtual factory is that it can restructure itself quickly to respond to changing customer demands and needs. Likewise, the dynamic nature of a virtual corporation allows it to change its relationships and structures in response to the customer’s changing needs.

Virtual Supply Chains Have Their Limitations

Although it may be easy to become infatuated with the attractiveness of lateral supply chains and virtual organizations, the unfortunate fact remains that synchronizing the activities of a network of independent firms can be extremely challenging. What each member enterprise might gain in scale, scope, and focus, it may lose in the ability to see and understand the multitier supply chain processes and their interdependencies, as well as the ability to control them.

Horizontal integration indeed brings about the complexity of the global supply network, with multiple connections around the world and information shared on networks, all connected along the chain. The outsourcing of manufacturing operations is a growing trend, and it offers numerous cost-savings and other benefits for original equipment manufacturers (OEMs) and brand owners. However, there is a trade-off, as outsourcing manufacturing operations also increases complexity because it creates virtual enterprises, where data and operations reside within the disparate systems of third parties.

Further challenging the channel masters is the increasing volatility of customer demand. This unpredictability makes it critical for the supply chain to be more agile and responsive in order for companies to be successful. Brand owners are accountable for their brand, quality, and customer satisfaction. Meeting the increasing number of compliance regulations requires them to coordinate their trading partners’ activities as well as to quickly and confidently respond to any and all changes. To do this, brand owners need multi-enterprise visibility across their supply chains, both internal and external.

The electronics industry is a good example of a business sector that has been particularly affected by the increase in outsourced manufacturing. Brand owners, OEMs, and contract manufacturers (suppliers) all face the implications of growing global competition; shorter product life cycles; intense innovation, which results in the constant launch of new products into the market (despite failure of most of these products); complexity of products’ features and distribution operations; and unpredictable demand.

These days, the electronics industry (like most industries) must operate in an evermore challenging consumer climate: product or service quality must be a given; product price often gives way to availability or special product features; and hard-to-please, well-informed consumers are a mere click away from learning about competitive offerings or from posting their dissatisfaction with a seller’s poor service at heavily visited consumer advocacy web sites.

In other words, in the cutthroat marketplace of the electronics industry, companies are realizing that the factors upon which they compete (so-called “order winners”) are changing. With numerous competitive options and vast consumer resources available to research and compare products, no selling company will survive with an inferior product, unjustifiably high prices, or a non-responsive supply chain (i.e., suboptimal customer service).

Furthermore, while outsourcing and lateral supply chains provide the nominal price advantages of sourcing from low-cost countries (see Understanding the True Cost of Sourcing), on the downside, they often come with longer order lead times and frequent disruptions because there are so many intermediaries between the brand owner and the contract manufacturer (such as distribution centers, inventory hubs, regional sales centers, consignment inventories at retailers, etc.).

Add to this frequent customer requests for product configuration changes (often after the initial order has already been placed) and frequent in-house engineering change requests (ECRs) driven by the need for constant innovation and ever-shorter product life cycles, and one can only imagine the ramifications for enterprises still relying on inadequate, traditional, forecast-based planning and related push-oriented manufacturing strategies (i.e., along the lines of the “if we build it, they will come” mantra).

Electronics companies that still rely on such outdated concepts can certainly “pick their poison” (means of failure): decreased customer satisfaction (and increasing customer erosion); missed revenue or earnings per share (EPS) goals, which, in turn, lead to an inability to win new bids; poor key performance indicator (KPI) metrics (poor inventory turns, wrong inventory mix, excess and obsolete inventory, margin erosion, etc.); and so on.

The electronics industry today is made up of the type of virtual enterprises mentioned earlier, where brand owners, contract manufacturers, and lower-tier suppliers are interconnected partners in a coordinated operation. In such an environment, one member's actions will affect many other members, and as such, major decisions cannot be made in isolation. In fact, decisions require consultation and input from all those that can influence or be influenced by them. Thus, the market drivers discussed above have made the supply chain an increasingly influential part of a company's success or failure, but they have not made the supply chain manager’s job any easier.

Internet-based technological advances have not necessarily changed the “old” mind-sets and practices of relying on traditional supply chain applications, which have major visibility and information gaps. As the saying goes, “the best laid plans of mice and men often go astray”: everything unravels when “the rubber meets the road” (when those plans are executed). Only the most fortunate manufacturing, distribution, and supply chain environments experience merely minor or manageable changes between the planning and execution stages. For most companies, such changes are hardly ever minor. Rather, they are endless variances between planning and forecasting (the “ideal world” of ivory towers) and fulfillment (which takes place in the treacherous “real world” of the manufacturing and distribution trenches).

For all the investment made in sophisticated demand management tools, almost proverbially, the only sure thing about a forecast is that it will be, by and large, wrong. Forecasts routinely miss actual demand (in fact, they are rarely better than 70 percent accurate, according to some findings within industries with volatile demand, such as consumer electronics). This can result in disastrous inventory pileups, missed financial targets, and supply chain conflicts among brand owners and their supplier networks. Overly optimistic forecasts can lead companies to lose touch with actual demand signals, and leave billions of dollars in excess inventory in the pipeline. The example of Cisco Systems’ multibillion-dollar write-off of obsolete inventory in the early 2000s still speaks volumes in this regard.
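
To put a figure like “70 percent accurate” in perspective, here is a small Python sketch of one common way to score a forecast: one minus the mean absolute percentage error (MAPE). The demand figures are invented for illustration, and other accuracy measures exist; the point is simply how quickly per-period misses pile up.

def forecast_accuracy(forecast, actual):
    """Accuracy = 1 - mean absolute percentage error, floored at zero."""
    errors = [abs(f - a) / a for f, a in zip(forecast, actual) if a]
    mape = sum(errors) / len(errors)
    return max(0.0, 1.0 - mape)

forecast = [100, 120, 90, 150]   # units planned, per period (invented)
actual   = [70, 130, 60, 140]    # units actually demanded (invented)
print(f"{forecast_accuracy(forecast, actual):.0%}")  # about 73% for this sample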

Conversely, a lesser-known fact is Apple's overly pessimistic forecast of initial iPod sales a few years back. Although some might think that discovering that actual demand for your product far exceeds your forecast is a good problem to have, it was only because of Apple’s lean and flexible supply chain that the runaway success of the iPod did not turn into an embarrassing disaster.

Thus, since alignment of demand and supply is an increasingly difficult challenge in the unpredictable electronics environment, companies should not spend a great deal of time and resources trying to predict customer demand. That is to say that planning has become less effective. A much more important capability for organizations now is to be able to rapidly and astutely respond to what is happening at the moment.