Making Sense Of Adobe Sensei

A combination of machine learning and deep-learning capabilities embedded into Adobe technologies, Sensei tackles many of today’s complex experience challenges that go well beyond the creative department.

Necessity might be the mother of invention, but it takes many forward-thinking minds to figure out what customers want before they even realize it themselves.

That’s the main reason why so many companies have been putting real money and resources behind establishing “innovation labs.” Indeed, a 2015 study by Altimeter Group found that 38% of the leading 200 companies have teams dedicated to staying on top of digital trends and leading their innovation efforts.

But long before that research came out, Adobe had set up its own innovation lab. “Adobe Research” launched in November 1988 with five people tasked with helping the company stay ahead of digital trends and technology advancements. Almost three decades later, a team of more than 120 researchers and engineers at Adobe locations around the world works on more than 300 research projects per year.

Focus On AI

In 2003, Jon Brandt joined the Adobe Research team as part of an effort to expand the company’s focus on artificial intelligence (AI).

“Think back to 2003,” said Brandt, now senior principal scientist and director of the Media Intelligence Lab within Adobe Research. “This was really the time of the big data craze. More internet-connected devices, software, and services were moving to the cloud, which prompted us to start looking for ways to help our customers act on the heaps of data they have. We knew that AI would be one of the key ways to help our customers do their jobs better, more creatively, and also more intelligently as data sets became bigger and bigger.”

One of the first projects Brandt worked on was enabling automatic red-eye detection and correction within Photoshop. Machine learning, a type of AI that allows technology to learn or change when exposed to new data, enabled the functionality. That was a huge breakthrough for the product, according to Brandt.

See machine learning in action via the video below.

About six years after Brandt came aboard, Adobe announced its $1.8 billion plan to acquire web analytics platform Omniture. The move served as a foundation for the company’s foray into the marketing technology space. It also served as a catalyst for the invention of approximately 40 technologies, which are now part of the Adobe Experience Cloud offering. Nearly all of those technologies are enhanced with AI capabilities, according to Shriram Revankar, VP and fellow of the Big Data Experience Lab within Adobe Research.

Anomaly detection, which studies customers’ online behavior and builds predictive models based on vast amounts of historic data, is an early example of AI in Adobe’s marketing technology. Should something out of the ordinary happen—such as a spike in sales on a typically slow Tuesday—the technology recognizes the unusual pattern and generates alerts. Further, it looks into the huge permutations of data to identify what might have contributed to the spike.
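As an illustration only, the core idea behind flagging a spike on a typically slow Tuesday can be sketched as a z-score test against a historical baseline. The function name and numbers below are hypothetical, and Adobe’s production models are far more sophisticated than this:

```python
from statistics import mean, stdev

def is_anomaly(history, new_value, threshold=3.0):
    """Flag a new observation whose z-score against the
    historical baseline exceeds the threshold."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(new_value - mu) / sigma > threshold

# Sales on typically slow Tuesdays, then an unexpected spike:
baseline = [120, 115, 130, 118, 122, 125]
is_anomaly(baseline, 540)  # the spike is flagged
is_anomaly(baseline, 128)  # an ordinary Tuesday is not
```

Identifying *why* the spike happened, as the Adobe technology does, then means searching across many combinations of dimensions (campaign, region, device, and so on) for the segments that drove the deviation.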

Another early example can be found in Adobe Social. The product features a social sentiment-analysis engine that sifts through billions of pieces of social data to better understand consumer sentiment or emotions around a brand and its products.

“Machine learning, statistical inferencing, and other algorithms have been used in Adobe products for a number of years now,” Revankar said. “Our earliest work was in natural language processing and social media analysis. Sentiment analysis, human emotion detection, and predicting the social reaction to a piece of content yet to be published were those technologies at work. Additionally, anomaly detection was our earliest contribution to Omniture/Adobe Analytics.”
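One classic (and deliberately simplified) way to score sentiment is lexicon-based averaging. The toy lexicon and function below are invented for illustration; systems like the one Revankar describes learn their weights from large volumes of labeled social data rather than using a hand-written word list:

```python
# Hypothetical toy lexicon; real systems learn word weights
# from large labeled data sets.
LEXICON = {"love": 1.0, "great": 0.8, "amazing": 1.0,
           "hate": -1.0, "broken": -0.8, "disappointed": -0.9}

def sentiment(post):
    """Average the scores of known words: > 0 reads as positive,
    < 0 as negative, and 0 as neutral or unknown."""
    hits = [LEXICON[w] for w in post.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

sentiment("love the new release it is great")  # positive score
sentiment("hate this update it feels broken")  # negative score
```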

That was just the tip of the proverbial iceberg. In 2012, the entire field of AI transformed, stemming from the opportunities around deep learning, according to Brandt. Deep learning enables a technology’s “brain,” a.k.a. the neural network, to build models from data that continuously improve as they are fed more of it.
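The “improves as it is fed more data” idea can be shown at toy scale, far below anything resembling a deep network, with a single trainable weight fit by gradient descent. Everything here is an illustrative assumption, not Adobe’s architecture:

```python
def train(data, epochs=200, lr=0.05):
    """Fit a single weight w so that w * x approximates y,
    nudging w against the gradient of the squared error."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

data = [(1, 2), (2, 4), (3, 6)]  # underlying rule: y = 2x
train(data)  # converges toward 2.0
```

A deep network is, very roughly, millions of such weights arranged in layers, trained the same way on vastly more data.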

Deep learning also enables content intelligence in Adobe’s Digital Asset Management (DAM) and Adobe Experience Manager (AEM) offerings. That includes the automatic tagging of images and making intelligent recommendations when a user searches for images. It also permits AEM to make smart suggestions about how content added to the queue should appear on mobile devices.

Enter Adobe Sensei

We’re nearly back to the present day. Abhay Parasnis, Adobe’s executive vice president and CTO, joined the company in early 2016. He said he was amazed by all the AI innovation coming from Adobe Research, as well as from other teams around the organization. From that observation, he pushed for the concept of “Sensei,” bringing all of Adobe’s AI and machine-learning innovation under one umbrella.

“What struck me right away when I joined Adobe were the massive stockpiles of data we had across our creative, document, and marketing clouds, developed over decades and at a scale unparalleled in the industry,” Parasnis said. “By using AI and machine-learning technologies, we could garner insights from this data that could not only transform how businesses operate but also amplify human creativity.”

Last November, Parasnis’ concept became reality. Sensei’s promise? To improve the design and delivery of digital experiences.

Watch Adobe’s CEO, Shantanu Narayen, explain why Adobe is betting on AI.

Since its launch, many Sensei capabilities and features have been added across all three of Adobe’s clouds: Adobe Experience Cloud, Adobe Creative Cloud, and Adobe Document Cloud.

In Adobe Creative Cloud, Sensei makes Adobe Photoshop completely face-aware. That includes finding faces in images; identifying features such as eyebrows, lips, and eyes and understanding their positions; and enabling a facial expression to be changed without ruining the image.

For the Adobe Experience Cloud side of the business, Sensei powers intelligent audience segmentation to give marketers and analysts visibility into which segments are most important to their businesses. It also enables attribution by algorithmically determining the impact of different marketing touch points on consumers’ decisions to engage with a brand. In addition, Sensei determines the effectiveness of different campaigns and makes optimized marketing investment recommendations.
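A crude stand-in for algorithmic attribution is linear attribution, which splits each conversion’s credit evenly across the touch points in the customer journey. The channel names and journeys below are invented for illustration; Sensei’s attribution is algorithmic and considerably more nuanced than an even split:

```python
from collections import defaultdict

def linear_attribution(journeys):
    """Split one unit of conversion credit evenly across the
    touch points in each converting customer journey."""
    credit = defaultdict(float)
    for touchpoints in journeys:
        share = 1.0 / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)

journeys = [["email", "search", "display"],
            ["search", "display"],
            ["email"]]
linear_attribution(journeys)
# email earns the most credit; total credit sums to 3 conversions
```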

Most recently, Sensei was integrated into the voice analytics capabilities in the Adobe Analytics Cloud, which is part of the Adobe Experience Cloud. Voice analytics has traditionally involved cumbersome, manual analysis. The integration of AI and machine learning into voice analytics helps companies identify trends within enormous data sets of voice-enabled interactions on platforms such as Amazon Alexa, Apple Siri, Google Assistant, Microsoft Cortana, and Samsung Bixby.

In Adobe Document Cloud, Sensei powers natural language processing to allow for text understanding and sentiment analysis of digital documents. Sensei-powered intelligent services unlock the intelligence that lives in documents and extract meaning that can be searched, analyzed, and incorporated into digital workflows.

“We believe it is important for us to focus our AI and machine-learning efforts specifically on the three areas where we have deep, domain-specific expertise,” Parasnis said. “This laser focus on creative services, document services, and experience [marketing] services sets us apart from what others in the industry are doing with AI.”