As I start my long journey home from Las Vegas to Boston, I’ve had some time to reflect on what I learned at the Adobe Summit. I learned a lot here – and not in the way I expected. Here are the 8 big things I learned in the last few days.
- Adobe is entering their next transition phase. Adobe CEO Shantanu Narayen kicked off the conference with a talk about the business transformation phases that Adobe has gone through in the last several years. They went from a desktop software company to a cloud-based creative solution with Creative Cloud. Then they transitioned to the Marketing Cloud over the last few years. He then announced the big transition to the Adobe Experience Cloud. With the mixed audience of technical and creative professionals, I think that message was lost on many. This is actually a big deal. Narayen has successfully navigated the company through some difficult transitions, and I think he has just hit his stride. Moving to an experience company broadens the company’s market and drives deeper relationships with their customers along the way.
- AI and Machine Learning everywhere with Sensei. I have to start with a tip of the hat to the name “Sensei”. It is a great combination of the senses required for AI and the confident guide that a sensei implies. I stopped counting references to machine learning and AI after the first 50 or so in the opening session. Going in, I expected lots of references to AI – just because every company in the world is positioning themselves as an AI company – but it was clear to me that they understood the applications of AI that made sense for creating great experiences.
- Giant alliances forming in Enterprise AI. The biggest “a ha” I had of the week was the fact that the world of enterprise AI is aligning into a few key alliances. On the heels of a major announcement linking Salesforce and IBM Watson to create AI for sales, Adobe and Microsoft announced their partnership to drive AI-based experiences. Both companies are coming from their positions of strength in the market (Salesforce with sales and Adobe with marketing), and I suspect they will soon start to put more pressure on the area of customer service, which is currently dominated by Nuance. I left that opening session with the feeling that Microsoft could credibly be a major player in Enterprise AI – and they should abandon their efforts with Cortana, which has been a lackluster player in the consumer voice assistant market. Partnered with Adobe, they are in a strong position to win.
- Mastering the platform. The other remarkable thing about the conference was the consistency of the platform. They have done a great job building a complex set of products into an integrated platform. I’m sure it isn’t all unicorns and rainbows, but they are clearly putting their focus on integration. One key signal supporting that strategy was the appointment of Brad Rencher as EVP of the Adobe Experience Cloud, which comprises all the Adobe ingredient parts.
- Religion is Sticky. The downside of a platform strategy is that it forces you to pick a religion as a customer. You are really quite committed to Adobe as a partner if you drink the Experience Cloud Kool-Aid. I had a few conversations with technical people who were sucking up all the product training they could, and it was clear that they were being indoctrinated into a way of thinking. The result is the Apple analog for the Enterprise – if you adopt Adobe Experience Cloud end-to-end, you can create amazing experiences, like the ones we saw from Adobe customers including T-Mobile, Carnival Cruises, and FRANKE. The only problem is that if you decide to leave the religion, you will probably spend years in detox.
- Great Innovation Pipeline – from “Sneaks”. At the beginning of the conference, I was concerned that Adobe had run out of ideas (see “mistakes” below). That sentiment changed completely when I saw the “Sneaks” session at the conference. The session is a peek at innovations on the near-term horizon that Adobe is testing with their customers for feedback. The Sneaks session is a fun event, and this one was co-hosted by Kate McKinnon of Saturday Night Live fame. Kate is super talented, but tried way too hard to be funny – it was actually a little painful to watch. The highlight was the innovations presented by the engineers working on this stuff in the labs. We saw video ads inserted into virtual reality, AI-based geo-targeting, and automatic design based on personalization segments. The best part was that they showed the demos in the current platform – proof that their platform strategy continues to be a focus – and a sign that customers of the platform will benefit from future innovations.
- Even the giants make mistakes. For all the important messages I took from the opening keynote, I actually found it a little disappointing on a couple of levels. My first issue was that the opening session felt a little disjointed to me. I’ve done big events like this and I know it is really hard to control your “talent”, but some of the opening session segments were a little disappointing. The biggest issue I had was with their head of innovation. I was excited to hear about all the cool new stuff that was going to come from Adobe, and instead, I saw an awkward presentation of things that weren’t really relevant. The first demo he showed was probably the most interesting – it showed how they used deep learning to tag photos from National Geographic with far improved accuracy. The problem with photos from someone like National Geographic is that AI-based systems analyzing the content would come back with “forest” or “outside” for almost all of their material. In contrast, the Adobe demo showed much more relevant tagging within the specific National Geographic domain. Unfortunately, it went downhill from there. The second demo was about using speech recognition – and it showed me that he clearly didn’t understand the application of speech. His demo was about cropping a photo with voice commands – a really silly application of voice technology. They could have easily shown how natural language commands could help a user navigate a complex system, or demonstrated video captioning and auto-tagging, but he chose something best suited for a touch UI. The next piece was about annotation in virtual reality. It was kind of interesting, but not that compelling. The final piece was a demo of working with photos in virtual reality – and it fell victim to the demo gods. Ultimately, he wasn’t able to show the whole demo, but it didn’t matter because it seemed like a demo of technology in search of an application.
Thankfully, they really redeemed themselves in the Sneaks session later in the week.
- Keeping it real. The last thing I would mention is the fact that they did an excellent job demonstrating their products in their platform. It was tricky in the big rooms to see the detailed screens, but it was very clear that they were working with real capabilities in their platform, not with PowerPoint demos.
Did you go to The Summit? I’d love to hear what you learned. Feel free to comment below.