Next generation is a topic that often comes up in a major technology transformation. Rather than put the same old product on a new platform, many elect to start from a fresh slate, asking themselves: what can I do in this new technology paradigm, and, more importantly, what will my customers need in the future?
We are often faced with having to explain to our customers why they should be interested in the new infrastructure option, and it is a very valid question. The benefit may be lower cost or generally better performance, but those arguments do not differentiate against any other solution that has made the same infrastructure decisions. Justification based on cost alone often doesn’t serve to maintain strong product margins either. When we fully embrace and incorporate the potential of a new platform or paradigm, we are no longer forced to compare the old versus the new, as the new offers business benefits that were not technically feasible or economically viable with the old.
Cloud computing is an example of a technology transformation that can have significant next generation implications. Migrating an existing on-premise solution onto cloud infrastructure may offer cost and performance benefits, but it will not differentiate. Starting from a clean slate and fully embracing Platform-as-a-Service, and what the platform, as well as the platform ecosystem, can enable in a next generation sense, will lead to more compelling value promises and true thought leadership. Cloud in itself isn’t transformational. Virtualization and Software-as-a-Service licensing models have been around for quite a while. It is what we do on it that can be.
Big data is one of those terms that really means nothing or pretty much anything. In multi-tenant environments we generate vast quantities of data that can be easily mined and analyzed across the whole population of customer accounts. Data privacy and the proprietary nature of data in certain industries may cause concern, but if we only analyze a higher-level sanitized abstraction of the data, with large enough populations, these concerns should not be an issue. Benchmarks can be of great value as they tell us how we are performing in relation to our peers. Adding layers and streams of additional data can further enhance the value of our core data set. When we apply analytics to the data we can generate deeper and more valuable insight. Statistical analysis can help us predict outcomes, and analysis of outcomes can enable systems to prescribe action.
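As a minimal sketch of the idea (the tenant names, the metric and the population threshold are all invented for illustration), a multi-tenant benchmark can reduce each account to a single sanitized figure before anything is aggregated across accounts:

```python
# A minimal sketch of benchmarking on a sanitized abstraction of
# multi-tenant data. Tenants and metrics are hypothetical; only
# per-tenant aggregates ever leave the raw data set.
from statistics import mean, median

# Raw per-tenant metric, e.g. days to close a sales opportunity.
raw = {
    "tenant_a": [12, 18, 9, 30],
    "tenant_b": [22, 25, 19],
    "tenant_c": [8, 11, 14, 10],
}

MIN_POPULATION = 3  # only publish benchmarks over large enough populations

def benchmark(raw_data, min_population=MIN_POPULATION):
    """Reduce each tenant to one sanitized figure, then aggregate
    across the population so no individual account is exposed."""
    per_tenant = [mean(values) for values in raw_data.values()]
    if len(per_tenant) < min_population:
        return None  # population too small to anonymize safely
    return {
        "population": len(per_tenant),
        "mean": round(mean(per_tenant), 1),
        "median": round(median(per_tenant), 1),
    }

print(benchmark(raw))
```

Each customer can then be shown where they sit against the population without any single account's raw data being disclosed.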
Mobility really became a factor with 4G speeds and the proliferation of smartphones and slates. Next generation services must account for a mobile workforce. Mobility often leads to a reassessment of the user interface. This is in line with the general ‘applification’ trend in software solutions. Some take the approach of mobile first and in some cases mobile only.
Gamification has become prevalent in social networking. In an enterprise context gamification starts with enablement of collaboration. Collaboration evolves into social enterprise. When we add e-learning to the mix we have the base ingredients for gamification. Here is a great article on gamification in an enterprise context: http://www.hcamag.com/opinion/get-your-employees-off-the-bench-and-in-the-game-178027.aspx
We never really start from a clean slate, as we have legacy and/or open source code libraries. The first build is a minimum viable product (MVP) that is good enough to compete with market entrants and forms a platform for a modular development roadmap executed through agile sprints. Modules can be included in the core price of the solution or can be add-on in nature. They can be functionality, content or services. With next generation services offerings we need to think through average revenue per user (ARPU) and maximizing recurring profit, rather than revenue. The difference is that in maximizing revenue we are focused on top-line sales, while in maximizing profit we also pay attention to delivery performance. Modularization often leads to e-commerce and in-product discovery. Opening the platform to third parties rounds it off as an ecosystem content platform.
The software industry is going through dramatic changes and traditional enterprises are not immune to this transformation. For most traditional industries, software-based services are less than 5% of overall gross revenues. As processes become more automated and machine controls become increasingly digitized, enterprises are faced with vast quantities of data. Even most software companies have yet to fully explore their data play, leaving money on the table. Data is increasingly the new currency.
Traditional enterprises will continue to improve core processes, develop new materials, engineer more efficient machines, research new fuels, etc., but as consumers and as business customers, we increasingly expect even traditional products and services to come with applications that are facilitative, collaborative, analytic, predictive and even prescriptive, in order for us to be able to maximize the return on our investments. Intelligent systems enable data to flow across an enterprise infrastructure, spanning the devices where valuable data is gathered, to back end systems where that data can be translated into insights and action.
The first step is to structure, collect and display data in a static reporting format. The design of the data architecture needs to recognize that this is only a minimum viable offering and that the design must support additional data streams, data merges, benchmarking and analytics. As the amount of data stored increases, the architecture must allow for this with minimal degradation in quality of service.
No system exists in a silo, but rather as a component of an ecosystem of solutions and services. Integration with possible value adding third party data streams or overlays should be considered. Examples of overlays are geological, geospatial, socioeconomic, etc. Value adding data streams can be threat data, macro-/microeconomic data, ERP/CRM data, social feeds, weather data, etc. Layering data or merging data increases the value of our core data set generating a wider range of insights. To fully monetize our core data enterprises should also consider if other parties within the ecosystem could use the data to generate value.
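A minimal sketch of the layering idea, with an invented weather overlay joined onto an invented core data set by date and site (all names and figures are hypothetical):

```python
# A minimal sketch of enriching a core data set with a third-party
# overlay stream (here, hypothetical weather readings), keyed by
# date and site. All records are invented for illustration.
core_events = [
    {"date": "2013-07-01", "site": "plant_1", "output_units": 940},
    {"date": "2013-07-02", "site": "plant_1", "output_units": 710},
]

weather_stream = {  # third-party overlay, keyed the same way
    ("2013-07-01", "plant_1"): {"temp_c": 24, "humidity": 0.55},
    ("2013-07-02", "plant_1"): {"temp_c": 36, "humidity": 0.80},
}

def merge(events, overlay):
    """Attach overlay attributes to each core record when the key
    matches; the merged set supports insights (e.g. output versus
    temperature) that neither stream offers alone."""
    enriched = []
    for event in events:
        key = (event["date"], event["site"])
        enriched.append({**event, **overlay.get(key, {})})
    return enriched

for row in merge(core_events, weather_stream):
    print(row)
```

The same join pattern extends to threat data, economic data, ERP/CRM data or social feeds; each added layer widens the range of questions the core data can answer.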
In addition to adding depth to services, enterprises can also add breadth. Value chain integration, expansion to adjacent markets and addressing external stakeholders (such as communities, local governments, etc.) can increase breadth of the addressable population. Each stakeholder persona has their own needs and motivations that add complexity to the services being offered and marketing messaging.
Software based ‘overlay’ services to traditional core products and services can increase utilization, satisfaction and loyalty.
The way we interact with machines is evolving. I remember the lines outside electronics stores when the Nintendo Wii first came out. The controller wand revolutionized gaming. Since then Sony has come out with their own wand and Microsoft upped the game with Kinect. AquaTop is a cool derivative use of Kinect, where the touch surface is made out of a pool of water… and the innovation goes on and on.
Virtual touch has gone mainstream with controllers, such as Leap Motion and the Haptix project. Touch gesture control already dominates all mobile devices from slates to smart phones. We are slowly breaking the boundaries between human and machine. Virtual touch gestures are more intuitive and mimic how humans generally interact with their surroundings through motion.
I remember working with 3D virtual touch sensors in 2006. At the time solutions were very limited and the virtual touch grid was very basic. We were already envisioning storefront screens where passersby could interact with applications through the store window without touching the glass or a screen. We envisioned lobby portals with interactive building maps, and screens for changing booths that would overlay product information to enhance the shopper’s experience. Philips 3D autostereoscopic LCD monitors are still a bit on the expensive side, but represent the future of 3D. 3D will not take full flight until we, as consumers, can get rid of those ridiculous glasses. When we fuse full 3D projection and accurate virtual touch, I think we will have taken a huge step in complexity of expression and interaction.
Windows 8 is definitely designed with mobile and touch in mind. I recently purchased the Surface Pro and I use it now as my primary work PC. I have it hooked into a Polycom speaker (no more wired headsets) and a monitor. I have a Bluetooth mouse and keyboard. This was my first foray into Windows 8… and even though I am waiting for 8.1 to go official, I really don’t understand all the talk about Windows 8 being so different or difficult. It took me about a day to figure it out and become fully productive. Now I could not go back to Windows 7 anymore. Imagine Windows 8 with an autostereoscopic monitor, where the tiles are not just on a two-dimensional plane, but can be stacked three-dimensionally. Where live tiles offer 3D content. The thing that I think makes Windows 8 next generational is the integrated nature of services and the seamless flow. What if we could integrate services in three dimensions and overlay data in 3D?
I love movies that are fantastical and push the limits of our imagination. In Harry Potter newspaper pictures are live and have depth. In Minority Report we project screens into the air and interact with virtual touch. In Tron we merge the human consciousness with the virtual world. I don’t think we are that far off with any of these examples.
I’ve now done quite a few business modeling sessions with ISVs about operationalizing their cloud strategies. Every case has its unique needs, but there are some commonalities as well.
When asked about services layer elements, like monitoring, metering, billing and provisioning, all claim to have a handle on them. However, when you scratch the surface, none have all services elements thought out and automated. A whole ecosystem of services layer partners exists. Some are more mature than others, but all seem to require a degree of configuration and customization to work with complex enterprise solutions. Green Button plays in an interesting space with regards to burst compute provisioning, but I cannot see ISVs agreeing to their margins for long. The ability to provision compute from hot nodes on demand should be a built-in PaaS feature.
Financial transaction modeling is also an area where ISVs need help. It is not that mature ISVs would not have advanced financial systems for accounting, or that they would not be able to calculate their cloud costs. It is more a need for the ability to do sensitivity analysis based on average deal sizes. What is required for breakeven? What is realistic? What is in it for the channel? Often ISVs aim their modeling at reaching an acceptable breakeven model, but neglect to calculate what profit margin that would offer a partner who is looking to commit a full-time resource to promoting the solution. Partners are not in it to break even. They want a profit margin on top of cost of sales and/or cost of support.
ISVs that are not used to SaaS pricing need to rethink their incentive models. A 100k on-premise solution with a 20k S&M component, for a partner with a 40% margin on the first year, will provide more than twice the profit of a comparable recurring SaaS deal, all else being equal. We need to look at the lifetime value of the deal. If we were to provide 10% on subsequent years, both for S&M and for SaaS renewal, the partner would still not benefit from a SaaS sale over an on-premise sale. Assuming that a typical technology depreciation cycle is 5 years, a SaaS partner would not break even with an on-premise partner before the on-premise partner sells a whole new solution with perpetual licensing and the whole cycle starts again. Pricing in all models needs to be based on the roles and responsibilities of the parties concerned. The question is: what are the roles and responsibilities in a SaaS model, and how do they differ from a perpetual license sale?
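As a rough sketch of that arithmetic (the annual SaaS fee is assumed, purely for illustration, to be one fifth of the perpetual price to match the 5-year depreciation cycle; the margin splits follow the figures above):

```python
# Sketch of partner profit over a 5-year cycle: perpetual vs. SaaS.
# The annual SaaS fee and the margin splits are illustrative
# assumptions, not quoted prices.

YEARS = 5  # typical technology depreciation cycle

def onprem_partner_profit(license_fee=100_000, sm_fee=20_000,
                          first_year_margin=0.40, renewal_margin=0.10):
    # Full margin on the year-1 license and S&M sale,
    # then a thin margin on the annual S&M renewals.
    profit = first_year_margin * (license_fee + sm_fee)
    profit += renewal_margin * sm_fee * (YEARS - 1)
    return profit

def saas_partner_profit(annual_fee=24_000,
                        first_year_margin=0.40, renewal_margin=0.10):
    # Margin on the year-1 subscription, then on each renewal.
    profit = first_year_margin * annual_fee
    profit += renewal_margin * annual_fee * (YEARS - 1)
    return profit

print(onprem_partner_profit(), saas_partner_profit())
```

With these assumptions the perpetual deal nets the partner roughly 56k over the cycle versus roughly 19k for the SaaS deal, which is exactly the gap a rethought incentive model has to close.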
Value articulation can also be tricky. ISVs often fall victim to separating their core value from the value derived from the cloud. Faster, cheaper and easier, associated with cloud in general, doesn’t differentiate any more. What that specific solution can do when powered by the cloud differentiates, and should be integrated into the core value messaging. One value often underutilized is the big data inherent to large multitenant solutions. This degree of benchmarking and data analytics has not been possible on premise and is one of the core value adds of the cloud.
Consider the Windows 8 smartphone ecosystem. Nokia is known for mapping, HTC now for audio with Beats Audio, and Samsung will likely be known for integration with smart appliances. So who else would fit this picture?
It has been long rumored that Facebook will come out with a smartphone. It would definitely lead the pack as the socially inspired device. What about Electronic Arts or Activision coming out with a gaming inspired device? What about a Disney or Warner inspired phone? How inspired could a Virgin Mobile device really be? What if RIM went Win 8 and came out with an enterprise information worker inspired device? Would they still lead the pack? Coca Cola spends millions researching daily patterns of consumers to optimize the ‘aah’ moment. What would a Coca Cola inspired phone be? How about a phone inspired by an automotive giant? Real time readings on oil viscosity, wireless engine tuning, etc.?
HP has announced that they will get back into the smartphone game, but who will they inspire? What will their uniqueness be? How will they differentiate, when hardware, form and usability have become commodity? Will HP allow another brand (maybe one listed above) to inspire their device, or will they bring to market a portfolio of inspired devices for different market segments?
At the end of the day, yesterday’s inspiration gets consumed by the ecosystem and becomes commodity. Device manufacturers need to continue to evolve and inspire consumers in new ways. In this sense the game has become even more competitive and unforgiving.
I would predict that Apple will lose out, because however inspired they are, they cannot ultimately compete with the aggregated inspiration of an ecosystem. Apple is also too locked down and Android is too open… Win 8 is just right.
What kind of phone do you aspire to have?
A couple of days ago it came to me that solutions that gather data are starting to be commodities. We gather multiple formats of event data… geo-location, barometric, social, video, etc. Just as an example, if I were to fly to another city: I might socially broadcast that I am taking the trip, linking to others that will be at the same destination or taking the same ride. I sign in at the airport terminal, hotel, restaurant, etc., adding a geo-location dimension to the trip. I might be tagged in a picture from the trip. I might even have been caught on a number of video surveillance cameras. Some actions might exceed thresholds and trigger workflows. The point is that for any action we take in life today, a lot of metadata is collected.
Another area is statistical data on data. Averages, means, medians, etc. I am flying to Chicago… my peers on average fly to Chicago twice as often per month as I do. They get upgraded more frequently. During their trips they eat more sushi. During their trips the sun shines more often. The tone of their social feeds on average is more positive. They even pay $200 less per ticket on average.
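These peer comparisons are simple "data on data" statistics. A tiny sketch, with all numbers invented for illustration:

```python
# A minimal sketch of 'data on data': comparing one traveler's
# figures against a peer group. All numbers are invented.
from statistics import mean

my_trips_per_month = 2
my_ticket_prices = [450, 520, 480]

peer_trips_per_month = [4, 5, 3, 4]
peer_ticket_prices = [280, 310, 250, 260]

# Derived statistics: ratios and gaps against the peer averages.
trip_ratio = mean(peer_trips_per_month) / my_trips_per_month
price_gap = mean(my_ticket_prices) - mean(peer_ticket_prices)

print(f"Peers fly {trip_ratio:.1f}x as often per month as I do")
print(f"I pay ${price_gap:.0f} more per ticket on average")
```

The raw events are the commodity; the derived comparisons are where the value starts to accumulate.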
In some cases data is being generated by the very nature of being, doing and mother nature. In other cases the data is user generated. We gather data, we generate data and we mesh data to create new dimensions from the raw data. Yet we have not really put any thought into the data. We throw compute and algorithms at the data to detect patterns and statistically study cause and effect. And lastly we can add human thought to draw conclusions. The further we link and study data, the more valuable the output product will become. The more variables we introduce into the equation, the more complex and encompassing the results will be.
We will invent new forms of data collection for years to come, but as consumers of data we also want higher levels of abstraction… more valuable data. We want it more easily, cheaper and faster.
EXISTING SOLUTION VENDORS… explore your solutions and the data that you sit on. Think of the reference architecture you belong to and integrate into, and the potential for data interchange. For those multitenant solutions in the cloud, what is your data play?
START UPS… don’t think of an independent app that performs a function in a silo. Think of the reference architecture you want to be in and what value you can bring by gathering, creating, linking and analyzing data for greater value to the user.
BE DATA CENTRIC!
Everyone knows what Business Analytics is, but do you know what Workforce Analytics is? The term is used in a Deloitte Human Capital Trends 2011 report, but even Wikipedia doesn’t define what it means. SuccessFactors, an SAP company, says: ‘Drive fact-based business decisions with reliable workforce insights.’ The Deloitte report cites ‘moving from reactive to proactive’ as a driver. So in summary: using workforce-generated data for proactive, rather than reactive, fact-based business decision making.
So where does this workforce data come from? Deloitte mentions ERP and HR systems having collected such data for a decade. Information security logs are a far richer data source. Infosec logs will slice and dice every event from an application, client and network perspective. Most mature organizations also have decades of infosec logs that provide a keystroke-by-keystroke record of every workforce action event from the date that recording started.
Business Analytics can be classified into the following categories of analytics:
- Marketing Research & Analytics
- Supply Chain Analytics
- Legal Research and Process Analytics
- Services Operations Analytics
- Business and Technical Analysis
- Data Analytics
- Pharma and Healthcare Research
- Financial Services Research
- Human Resources Research
- Intellectual Property (IP) Research
In each case we are examining the business looking from the outside in. Workforce analytics provides the same data, but looking from the inside out. In the coming years we will see a new breed of analytical platforms that analyze and provide insights from workforce data. Eventually workforce and business analytics data warehouses will be linked to form a single comprehensive analytics view.
Social Analytics is neither from outside nor from within. Social Analytics is a reflection of the company’s actions outside their domain. As contextual search and social analytics develop, we will be able to add this additional real-time dimension to truly agile business decision making.
Businesses do not sell to businesses. People sell to people. Business is a human endeavor. This is why sentiment analysis adds an additional dimension to business decision making. From Workforce Analytics we know what our people did. From Business Analytics we see how our business performed as a result. From Social Analytics we see how the world around us reacted. Sentiment analysis gives us an indication of why.
We already today have advanced data models for forecasting stock market behavior, weather patterns, etc. This is all statistical, based on projecting from historical data points. If we know what the people did, how the business performed, how the world reacted and how everyone felt about it contextually, then we can run quite advanced predictive models on pretty much any action. It’s like being able to look two chess moves into the future. You still would not see the ultimate outcome, but you could predict immediate cause and effect. How will a customer react to this price point? How will consumers react to this marketing campaign? How will my staff react to the firing of a certain individual? How would my team be impacted if the team leader switched companies? Which parts of my organization have the most impact on revenue generation, and where are the optimal resourcing thresholds?