I am thinking about the future of the media industry and what is happening to the entertainment and media mix. eBooks are outselling paperbacks, cable TV is losing subscriptions to streaming services, hard copy purchases of music and video recordings are losing ground to digital purchases, and newspaper subscriptions have been going down for the past 14 years even though the number of households has been growing. But there are winners too. Radio advertising revenues have been growing since 2010 and box office ticket sales are bigger than ever! The US entertainment and media industry overall is growing at a faster pace than GDP. The media pie is not becoming smaller, but rather evolving in its ability to tell stories. Making content digital, mobile accessible and available for real-time streaming, and allowing consumers to buy increments with micro-payments, are just foundational elements that will enable transmedia storytelling, or cross-media integration.
Jeff Parkin’s TEDx video tells the story of transmedia storytelling extremely well: https://www.youtube.com/watch?v=to2IZzhRCs0#t=1067
Cross-media integration is being adopted by large corporations. Unilever’s Susan Glenn campaign for Axe is an example of the potential in cross-media integration, more than doubling brand recognition compared to its main competitor. Sprint’s Does Your Phone Dream When it is Shut campaign is edgy, mobile, interactive and data driven. Both examples are media rich, but only mildly integrated. We are only seeing the tip of the integration iceberg.
From a technology enablement perspective, transmedia storytelling requires:
- Both business and content management supported by powerful analytics and compute elasticity
- Concept development and resource management
- Content acquisition, licensing and management
- Content planning, scheduling and quality control
- Advertising sales and scheduling
- E-Commerce and billing
- Automation, master control and digital platform
- Digital delivery
To the audience it is all about the content experience, which requires a large subscriber base, a comprehensive media mix and a rich multi-form-factor device ecosystem to enable participation.
It has been a while since I’ve reflected on the economy. The last couple of years have seen steady growth, but mostly in stock prices through corporate stock buybacks. The real economy has grown at a far slower pace. One might say that for this reason current stock prices are inflated. However, the bull run seems to continue. November and December have historically seen increases in stock prices, but what will happen when consumer buying drops in Q1FY15?
The second half of 2015 will start being affected by the 2016 Presidential Elections, which typically means that all legislation slows, and until the election is done and the winner announced, there will be a long period of uncertainty. Even if a correction happens in the first half, the uncertainty will slow down the rebound.
The European economy has slowed down for multiple reasons. Some are structural and others related to uncertainty with Russia. Germany is the largest consumer of Russian gas and could cut the pipe off, buying its energy from other markets, but many of the other European nations do not have that strength. The formal sanctions on Russia have mostly been a nuisance. The real penalty comes from the US selling oil from its strategic reserves and pressuring its oil producing allies to add supply to the market. The Russian economy is based almost solely on energy sales, and the drop in energy prices is crippling it more than any sanctions could. In many ways Russia’s renewed military capability and ambition is not aligned with its economic base. A military is only mighty if it has a strong economy to sustain its commitment.
As I look at the US economy I see a slowdown in the fundamentals, regardless of the stock markets. The housing comeback has stalled and even taken a step back. Unemployment gains are regional, with net winners and losers. Middle class pay raises have been on hold for years. When we talk of the trickle down effect, the theory is predicated on the notion that there is actually a trickle, and that the trickle is large enough to fuel consumer spending, which in turn drives the economy. We now have an engine coughing with a constricted fuel line. What we are currently doing is consolidating wealth at the top with negligible trickle, and it doesn’t take a rocket scientist to figure out that the model will not work for long and the engine will stall. The power to make the engine turn again comes from the battery reserves and not the fuel tank. We are all proverbially in the same car, and the current strategy is not optimal from a distance-travelled perspective, even though the MPG reading might look good now.
Chairman Yellen recently spoke about her concerns over the wealth gap. The rise in stock prices through stock buybacks has made the wealthy even wealthier. The wealthiest will at some point cash out, which will cause a strong market correction. The stock held by the middle class is slower to react and will again carry the brunt of the blow. With a correction the gap will become even wider. The middle class will lose in the correction and the wealthy will reinvest at the bottom, reaping the gains of the next cycle. Sometimes the correction is uncoordinated, and that is when we have a depression and even the wealthy lose.
Where the current government has failed is in outlawing inversions, limiting stock buybacks, simplifying the tax code, enabling the repatriation of wealth and reducing corporate taxes to be more in line with the global average. All these actions would increase the trickle effect, as capital would be forced back to work at home. The administration talks about supporting the middle class, but it has been largely words. We fixed the housing catastrophe on the taxpayer’s dime, but we did not fundamentally change anything. The nation needs leadership with long-term vision and the willingness to make hard short-term sacrifices. We need structural change badly. I am more republican than democratic in my views, but we need enlightened, strong leadership, which we’ve been lacking for too long.
Next generation is a topic that often comes up in a major technology transformation. Rather than put the same old product on a new platform, many elect to start from a clean slate, asking themselves: what can I do in this new technology paradigm, and more importantly, what will my customers need in the future?
We are often faced with having to explain to our customers why they should be interested in the new infrastructure option and it is a very valid question. The benefit may be in a lower cost or generally better performance, but those arguments do not differentiate against any other solution having made the same infrastructure decisions. Justification based on cost alone often doesn’t serve to maintain strong product margins either. When we fully embrace and incorporate the potential of a new platform or paradigm, we are no longer forced to compare the old versus the new, as the new offers business benefits that were not technically feasible or economically viable with the old.
Cloud computing is an example of a technology transformation that can have significant next generation implications. Migrating an existing on-premise solution onto cloud infrastructure may offer cost and performance benefits, but it will not differentiate. Starting from a clean slate and fully embracing Platform-as-a-Service, and what the platform as well as the platform ecosystem can enable in a next generation sense, will lead to more compelling value promises and true thought leadership. Cloud in itself isn’t transformational. Virtualization and Software-as-a-Service licensing models have been around for quite a while. It is what we do on it that can be.
Big data is one of those terms that really means nothing or pretty much anything. In multi-tenant environments we generate vast quantities of data that can be easily mined and analyzed across the whole population of customer accounts. Data privacy and the proprietary nature of data in certain industries may cause concern, but if we only analyze a higher level, sanitized abstraction of the data with large enough populations, these concerns should not be an issue. Benchmarks can be of great value as they tell us how we are performing in relation to our peers. Adding layers and streams of additional data can further enhance the value of our core data set. When we apply analytics to the data we can generate deeper and more valuable insight. Statistical analysis can help us predict outcomes, and analysis of outcomes can enable systems to prescribe action.
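The sanitized, cross-tenant benchmarking described above can be sketched in a few lines. This is a minimal illustration, not a real implementation: the tenant names, metrics and minimum-population threshold are all invented for the example, and the point is simply that individual identities are dropped and aggregates are only published over a large enough population.

```python
from statistics import mean

# Hypothetical per-tenant metrics; names and values are illustrative only.
tenants = {
    "tenant_a": {"orders_per_day": 120, "fulfillment_hours": 18.0},
    "tenant_b": {"orders_per_day": 310, "fulfillment_hours": 12.5},
    "tenant_c": {"orders_per_day": 95,  "fulfillment_hours": 22.0},
}

MIN_POPULATION = 3  # only publish a benchmark over a large enough population


def benchmark(metric):
    """Aggregate one metric across all tenants, discarding tenant identities."""
    values = [t[metric] for t in tenants.values()]
    if len(values) < MIN_POPULATION:
        return None  # too few tenants to anonymize safely
    return {
        "population": len(values),
        "mean": mean(values),
        "min": min(values),
        "max": max(values),
    }


print(benchmark("fulfillment_hours"))
```

A tenant comparing its own `fulfillment_hours` against the published mean gets the peer benchmark without ever seeing another tenant’s raw data.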
Mobility really became a factor with 4G speeds and the proliferation of smartphones and slates. Next generation services must account for a mobile workforce. Mobility often leads to a reassessment of the user interface. This is in line with the general ‘applification’ trend in software solutions. Some take the approach of mobile first and in cases mobile only.
Gamification has become prevalent in social networking. In an enterprise context gamification starts with the enablement of collaboration. Collaboration evolves into social enterprise. When we add e-learning to the mix we have the base ingredients for gamification. Here is a great article on gamification in an enterprise context: http://www.hcamag.com/opinion/get-your-employees-off-the-bench-and-in-the-game-178027.aspx
We never really start from a clean slate, as we have legacy and/or open source code libraries. The first build is a minimum viable product (MVP) that is good enough to compete with market entrants and forms a platform for a modular development roadmap executed through agile sprints. Modules can be included in the core price of the solution or can be add-on in nature. They can be functionality, content or services. With next generation services offerings we need to think through average revenue per user (ARPU) and maximizing recurring profit, rather than revenue. The difference is that in maximizing revenue we are focused on top line sales, while in maximizing profit we also pay attention to delivery performance. Modularization often leads to e-commerce and in-product discovery. Opening the platform to third parties rounds it off as an ecosystem content platform.
The software industry is going through dramatic changes and traditional enterprises are not immune to this transformation. For most traditional industries software based services are less than 5% of overall gross revenues. As processes become more automated and machine controls become increasingly digitized enterprises are faced with vast quantities of data. Even most software companies have yet to fully explore their data play, leaving money on the table. Data is increasingly the new currency.
Traditional enterprises will continue to improve core processes, develop new materials, engineer more efficient machines, research new fuels, etc., but as consumers and as business customers, we increasingly expect even traditional products and services to come with applications that are facilitative, collaborative, analytic, predictive and even prescriptive, in order for us to be able to maximize the return on our investments. Intelligent systems enable data to flow across an enterprise infrastructure, spanning the devices where valuable data is gathered, to back end systems where that data can be translated into insights and action.
The first step is to structure, collect and display data in a static reporting format. The design of the data architecture needs to recognize that this is only a minimum viable offering, and that the design must support additional data streams, data merges, benchmarking and analytics. As the amount of data stored increases, the architecture must allow for this with minimal degradation in quality of service.
No system exists in a silo, but rather as a component of an ecosystem of solutions and services. Integration with possible value adding third party data streams or overlays should be considered. Examples of overlays are geological, geospatial, socioeconomic, etc. Value adding data streams can be threat data, macro-/microeconomic data, ERP/CRM data, social feeds, weather data, etc. Layering or merging data increases the value of our core data set, generating a wider range of insights. To fully monetize our core data, enterprises should also consider whether other parties within the ecosystem could use the data to generate value.
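As a minimal sketch of the layering idea, consider enriching a core operational data set with a third-party weather overlay keyed by site and date. The sites, dates and readings below are all made up for illustration; the mechanic is simply a keyed join that attaches overlay attributes to each core row.

```python
# Core operational data set (illustrative values only).
core = [
    {"site": "plant_1", "date": "2014-11-01", "output_units": 980},
    {"site": "plant_2", "date": "2014-11-01", "output_units": 1210},
]

# Third-party weather overlay, keyed by (site, date).
weather = {
    ("plant_1", "2014-11-01"): {"temp_c": 4, "precip_mm": 12},
    ("plant_2", "2014-11-01"): {"temp_c": 11, "precip_mm": 0},
}


def enrich(rows, overlay):
    """Attach overlay attributes to each core row when a key match exists."""
    enriched = []
    for row in rows:
        key = (row["site"], row["date"])
        enriched.append({**row, **overlay.get(key, {})})
    return enriched


for row in enrich(core, weather):
    print(row)
```

Once the merged rows exist, questions that neither data set could answer alone (does precipitation correlate with output?) become simple queries over the enriched set.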
In addition to adding depth to services, enterprises can also add breadth. Value chain integration, expansion to adjacent markets and addressing external stakeholders (such as communities, local governments, etc.) can increase breadth of the addressable population. Each stakeholder persona has their own needs and motivations that add complexity to the services being offered and marketing messaging.
Software based ‘overlay’ services to traditional core products and services can increase utilization, satisfaction and loyalty.
The way we interact with machines is evolving. I remember the lines outside electronics stores when the Nintendo Wii first came out. The controller wand revolutionized gaming. Since then Sony has come out with their own wand and Microsoft upped the game with Kinect. AquaTop is a cool derivative use of Kinect, where a touch surface is made out of a pool of water… and the innovation goes on and on.
Virtual touch has gone mainstream with controllers, such as Leap Motion and the Haptix project. Touch gesture control already dominates all mobile devices from slates to smart phones. We are slowly breaking the boundaries between human and machine. Virtual touch gestures are more intuitive and mimic how humans generally interact with their surroundings through motion.
I remember working with 3D virtual touch sensors in 2006. At the time solutions were very limited and the virtual touch grid was very basic. We were already envisioning storefront screens where passersby could interact with applications through the store window without touching the glass or a screen. We envisioned lobby portals with interactive building maps, and screens for changing booths that would overlay product information to enhance the shopper’s experience. Philips 3D autostereoscopic LCD monitors are still a bit on the expensive side, but represent the future of 3D. 3D will not take full flight until we, as consumers, can get rid of those ridiculous glasses. When we fuse full 3D projection and accurate virtual touch, I think we will have taken a huge step in complexity of expression and interaction.
Windows 8 is definitely designed with mobile and touch in mind. I recently purchased the Surface Pro and I use it now as my primary work PC. I have it hooked into a Polycom speaker (no more wired headsets) and a monitor. I have a Bluetooth mouse and keyboard. This was my first foray into Windows 8… and even though I am waiting for 8.1 to go official, I really don’t understand all the talk about Windows 8 being so different or difficult. It took me about a day to figure it out and become fully productive. Now I could not go back to Windows 7 anymore. Imagine Windows 8 with an autostereoscopic monitor, where the tiles are not just on a two-dimensional plane, but can be stacked three dimensionally, and where live tiles offer 3D content. The thing that I think makes Windows 8 next generational is the integrated nature of services and the seamless flow. What if we could integrate services in three dimensions and overlay data in 3D?
I love movies that are fantastical and push the limits of our imagination. In Harry Potter newspaper pictures are live and have depth. In Minority Report we project screens into the air and interact with virtual touch. In Tron we merge the human consciousness with the virtual world. I don’t think we are that far off with any of these examples.
I’ve now done quite a few business modeling sessions with ISVs about operationalizing their cloud strategies. Every case has its unique needs, but there are some commonalities as well.
When asked about services layer elements, like monitoring, metering, billing and provisioning, all claim to have a handle on them. However, when you scratch the surface, none have all services elements thought out and automated. A whole ecosystem of services layer partners exists. Some are more mature than others, but all seem to require a degree of configuration and customization to work with complex enterprise solutions. GreenButton plays in an interesting space with regards to burst compute provisioning, but I cannot see ISVs agreeing to their margins for long. The ability to provision compute from hot nodes on demand should be a built-in PaaS feature.
Financial transaction modeling is also an area where ISVs need help. It is not that mature ISVs lack advanced financial systems for accounting, or that they cannot calculate their cloud costs. It is more a need for the ability to do sensitivity analysis based on average deal sizes. What is required for breakeven? What is realistic? What is in it for the channel? Often ISVs aim their modeling at reaching an acceptable breakeven model, but neglect to calculate what profit margin that would offer for a partner who is looking to commit a full-time resource to promoting the solution. Partners are not in it to break even. They want a profit margin on top of cost of sales and/or cost of support.
ISVs that are not used to SaaS pricing need to rethink their incentive models. Compared to a 100k on-premise solution with a 20k S&M component, for a partner with a 40% margin on the first year, an equivalent recurring SaaS deal will provide less than half the first-year profit, all else being equal. We need to look at the lifetime value of the deal. If we were to provide 10% on subsequent years, both for S&M and for SaaS renewal, the partner would still not benefit from a SaaS sale over an on-premise sale. Assuming that a typical technology depreciation cycle is 5 years, a SaaS partner would not break even with an on-premise partner before the on-premise partner sells a whole new solution with perpetual licensing and the whole cycle starts again. Pricing in all models needs to be based on the roles and responsibilities of the parties concerned. The question is: what are the roles and responsibilities in a SaaS model, and how do they differ from a perpetual license sale?
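The partner economics above can be put into a rough worked example. The assumptions are mine, not from the original figures: the SaaS subscription is priced at 20k per year (so five years of fees roughly match the 100k perpetual license), and the partner margin rates mirror the on-premise case (40% in year one, 10% on renewals of both S&M and SaaS). It is a sketch of the comparison, not a definitive pricing model.

```python
PARTNER_MARGIN_Y1 = 0.40       # assumed 40% partner margin on first-year revenue
PARTNER_MARGIN_RENEWAL = 0.10  # assumed 10% on subsequent years (S&M and SaaS renewals)


def on_premise_profit(license_price=100_000, sm_annual=20_000, years=5):
    """Partner profit: 40% of license + first-year S&M, then 10% of S&M renewals."""
    first_year = (license_price + sm_annual) * PARTNER_MARGIN_Y1
    renewals = sm_annual * PARTNER_MARGIN_RENEWAL * (years - 1)
    return first_year + renewals


def saas_profit(annual_fee=20_000, years=5):
    """Partner profit: 40% of the year-one subscription, then 10% of each renewal."""
    first_year = annual_fee * PARTNER_MARGIN_Y1
    renewals = annual_fee * PARTNER_MARGIN_RENEWAL * (years - 1)
    return first_year + renewals


print(on_premise_profit())  # 48,000 + 8,000  = 56,000 over the 5-year cycle
print(saas_profit())        #  8,000 + 8,000  = 16,000 over the same cycle
```

Under these assumptions the SaaS partner earns well under half the on-premise partner’s profit over the depreciation cycle, which is the incentive gap the text describes: the model only changes if the SaaS margin structure is rebuilt around the partner’s actual roles and responsibilities.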
Value articulation can also be tricky. ISVs often fall victim to separating their core value and the value derived from the cloud. Faster, cheaper and easier, associated with cloud in general, doesn’t differentiate any more. What a specific solution can do when powered by the cloud does differentiate, and should be integrated into the core value messaging. One value often underutilized is the big data inherent to large multitenant solutions. This degree of benchmarking and data analytics has not been possible on premise, and it is one of the core value adds of the cloud.
Consider the Windows 8 smartphone ecosystem. Nokia is known for mapping, HTC now for audio with Beats Audio, and Samsung will likely be known for integration with smart appliances. So who else would fit this picture?
It has long been rumored that Facebook will come out with a smartphone. It would definitely lead the pack as the socially inspired device. What about Electronic Arts or Activision coming out with a gaming-inspired device? What about a Disney or Warner inspired phone? How inspired could a Virgin Mobile device really be? What if RIM went Win 8 and came out with an enterprise information worker inspired device? Would they still lead the pack? Coca Cola spends millions researching the daily patterns of consumers to optimize the ‘aah’ moment. What would a Coca Cola inspired phone be? How about a phone inspired by an automotive giant? Real-time readings on oil viscosity, wireless engine tuning, etc.?
HP has announced that they will get back into the smartphone game, but who will they inspire? What will their uniqueness be? How will they differentiate, when hardware, form and usability have become commodity? Will HP allow another brand (maybe one listed above) to inspire their device, or will they bring to market a portfolio of inspired devices for different market segments?
At the end of the day yesterday’s inspiration gets consumed by the ecosystem and becomes commodity. Device manufacturers need to continue to evolve and inspire consumers in new ways. In this sense the game has become even more competitive and unforgiving.
I would predict that Apple will lose out because, however inspired they are, they cannot ultimately compete with the aggregated inspiration of an ecosystem. Apple is also too locked down and Android is too open… Win 8 is just right.
What kind of phone do you aspire to have?
A couple of days ago it came to me that solutions that gather data are starting to be commodities. We gather multiple formats of event data… geo-location, barometric, social, video, etc. Just as an example, say I were to fly to another city. I might socially broadcast that I am taking a trip, linking to others that will be at the same destination or taking the same ride. I sign in at the airport terminal, hotel, restaurant, etc., adding a geo-location dimension to the trip. I might be tagged in a picture from the trip. I might even have been caught on a number of video surveillance cameras. Some actions might exceed thresholds and trigger workflows. The point is that for any action we take in life today, a lot of metadata is collected.
Another area is statistical data on data. Averages, means, medians, etc. I am flying to Chicago… my peers on average fly to Chicago twice as often per month as I do. They get upgraded more frequently. During their trips they eat more sushi. During their trips the sun shines more often. The tone of their social feeds on average is more positive. They even pay $200 less per ticket on average.
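The “statistics on data” idea above reduces to comparing an individual’s metrics against peer-group aggregates. The numbers below are entirely invented to mirror the Chicago example; the point is only the mechanic of deriving a second-order data product (a peer comparison) from raw event data.

```python
from statistics import mean

# Hypothetical trip metrics: me versus a peer group (all values invented).
my_trips_per_month = 1
peer_trips_per_month = [2, 3, 2, 1, 2]

my_avg_ticket = 650
peer_ticket_prices = [430, 460, 445, 470, 455]

# Derived statistics: how I compare against the peer-group averages.
trip_ratio = mean(peer_trips_per_month) / my_trips_per_month
ticket_delta = my_avg_ticket - mean(peer_ticket_prices)

print(f"peers fly {trip_ratio:.1f}x as often as I do")
print(f"peers pay ${ticket_delta:.0f} less per ticket on average")
```

Each derived statistic is itself a new, more abstract data product built on top of the raw event data, which is exactly the higher level of abstraction the paragraph argues consumers of data will pay for.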
In some cases data is being generated by the very nature of being, doing and mother nature. In other cases the data is user generated. We gather data, we generate data and we mesh data to create new dimensions from the raw data. Yet we have not really put any thought into the data. We throw compute and algorithms at the data to detect patterns and statistically study cause and effect. And lastly we can add human thought to draw conclusions. The further we link and study data, the more valuable the output product will become. The more variables we introduce into the equation, the more complex and encompassing the results will be.
We will invent new forms of data collection for years to come, but as consumers of data we also want higher levels of abstraction… more valuable data. We want it more easily, cheaper and faster.
EXISTING SOLUTION VENDORS… explore your solutions and the data that you sit on. Think of the reference architecture you belong to and integrate into, and the potential for data interchange. Those multitenant solutions in the cloud: what is your data play?
START UPS… don’t think of an independent app that performs a function in a silo. Think of the reference architecture you want to be in and what value you can bring by gathering, creating, linking and analyzing data for greater value to the user.
BE DATA CENTRIC!
Everyone knows what Business Analytics (BA) is, but do you know what Workforce Analytics is? The term is used in a Deloitte Human Capital Trends 2011 report, but even Wikipedia doesn’t define what it means. SuccessFactors, an SAP company, says: Drive fact-based business decisions with reliable workforce insights. The Deloitte report cites ‘moving from reactive to proactive’ as a driver. So in summary: using workforce generated data for proactive, rather than reactive, fact-based business decision making.
So where does this workforce data come from? Deloitte mentions ERP and HR systems having collected such data for a decade. Information security logs are a far richer data source. Infosec logs slice and dice every event from an application, client and network perspective. Most mature organizations also have decades of infosec logs that hold a keystroke-by-keystroke record of every workforce action event from the date that recording started.
Business Analytics can be classified into the following categories of analytics:
- Marketing Research & Analytics
- Supply Chain Analytics
- Legal research and process analytics
- Services Operations Analytics
- Business and Technical Analysis
- Data Analytics
- Pharma and Healthcare Research
- Financial Services Research
- Human Resources Research
- Intellectual Property (IP) Research
In each case we are examining the business from the outside in. Workforce analytics provides the same kind of data, but from the inside out. In the coming years we will see a new breed of analytical platforms that analyze and provide insights from workforce data. Eventually workforce and business analytics data warehouses will be linked to form a single comprehensive analytics view.
Social Analytics is neither from outside nor from within. Social Analytics is a reflection of the company’s actions outside its domain. As contextual search and social analytics develop, we will be able to add this additional real-time dimension to truly agile business decision making.
Businesses do not sell to businesses. People sell to people. Business is a human endeavor. This is why sentiment analysis adds an additional dimension to business decision making. From Workforce Analytics we know what our people did. From Business Analytics we see how our business performed as a result. From Social Analytics we see how the world around us reacted. Sentiment analysis gives us an indication of why.
We already today have advanced data models for forecasting stock market behavior, weather patterns, etc. This is all statistical, based on projecting from historical data points. If we know what the people did, how the business performed, how the world reacted and how everyone felt about it contextually, then we can run quite advanced predictive models on pretty much any action. It’s like being able to look two chess moves into the future. You still would not see the ultimate outcome, but you could predict immediate cause and effect. How will a customer react to this price point? How will consumers react to this marketing campaign? How will my staff react to the firing of a certain individual? How would my team be impacted if the team leader switched companies? Which parts of my organization have the most impact on revenue generation, and where are the optimal resourcing thresholds?
Dr. Gavin Michael, Scott Kurth and Michael Biltz have written the Accenture 2012 Technology Vision statement. Typically these documents, in my opinion, are broad and obvious, but in this case I was positively surprised.
The highlights are:
– Context-based services
– Converging data architectures
– Industrialized data services
– Social driven IT
– PaaS-enabled agility
– Orchestrated analytical security
To me the above are all interconnected and parts of the same conversation…
Contextualization can come through aggregating multiple dimensions of data (such as the social driven dimension) and having the ability to data mine across the aggregated data set. Human analytics will be slowly automated using AI to empower the masses. Putting events in geo context is obvious. Having AI that understands sentiment and word associations is a bit less obvious. The futuristic goal would be AI that places events in conceptual context. This is only enabled if we collect and aggregate data on events from multiple perspectives. I’d claim that cloud, and PaaS (Azure) to be specific, is driving integration and data aggregation. ‘A maturing platform-as-a-service (PaaS) market will shift the emphasis from cost-cutting to business innovation.’ In this sense I would have focused more on the ‘innovation’ potential in this report, rather than the deployment agility angle… but hey, I am not a doctor.
Converging data architectures is about mixing structured and unstructured data. In cloud environments data is split in structured relational data and unstructured blob data. Unstructured is much cheaper operationally and hybrid architecture optimization is key to minimizing operating costs. Distributed data from a cloud perspective means that we tap into data from multiple services and not just from the two structures.
The Azure Data Marketplace is a marketplace for industrialized data services. I ask all ISVs that I talk to what their data play is, and to date I have not stumbled upon a single one with an aggregate data monetization plan. To me this means that a whole heck of a lot of money is being left on the table.
Orchestrated analytical security is an interesting highlight at the end of the report. I have put a lot of thought into this, and to me it is just one possible manifestation of innovation based on maturing PaaS. Devices do not cause data loss… people do. In this sense device management is always reactive. Enterprise network security has matured to the point where it is preventing security breaches in real-time against known attack profiles and possible permutations of those known attacks. I believe that the future is more akin to the movie Minority Report. The more we are able to aggregate data on events from different perspectives and intelligently analyze that information, the closer we are to being able to truly predict behavioral patterns and prevent data loss even before it occurs. Orchestrated analytical security is really a possible outcome of the other identified themes and as such sticks out in the report, but it is definitely a nice closer that gets our imaginations turning.
In a past blog I used a macro-economic study of the emerging markets to understand cloud ecosystem growth issues. According to the study, the reason why China is outpacing India is that China also fully utilizes its female population and has a higher general level of education. This applies to cloud vendors as well. Accenture’s vision is only possible if the whole partner ecosystem is educated on the ecosystem nature and innovation potential of PaaS. It would be interesting if Accenture reached out to their clients to see if the clients understand and can embrace the vision.