AI and machine learning are the future of every industry, especially data and analytics. In Growing Up with AI, we help you keep up with all the ways these pioneering technologies are changing the world.
Reading through the Gartner Top 10 Trends in Data and Analytics for 2020, I was struck by how the same terms mean different things to different audiences in different contexts. We hear a lot about AI and analytics not only in internal conversations, but also from our customers and prospects. But what do we really mean when we talk about these issues?
Since these technologies will only become more important to our world, I thought it would be worthwhile, as Sisense’s head of AI research (AIR), to dive into 7 of the 10 trends on the list and give my views on each.
The article opens with a big statement about AI being operationalized, raising the bar for the data and analytics infrastructure needed to accelerate development and adoption:
“By the end of 2024, 75% of enterprises will shift from piloting to operationalizing AI, driving a 5X increase in streaming data and analytics infrastructures.”
This is a major change in the way AI has been used in the past alongside data and analytics, making both more powerful and effective. Let’s dive into these trends and see what else is on the horizon.
Trend 1: Smarter, faster, more responsible AI
“Within the current pandemic context, AI techniques such as machine learning (ML), optimization and natural language processing (NLP) are providing vital insights and predictions about the spread of the virus and the effectiveness and impact of countermeasures.
“Significant investments made in new chip architectures such as neuromorphic hardware that can be deployed on edge devices are accelerating AI and ML computations and workloads and reducing reliance on centralized systems that require high bandwidths. Eventually, this could lead to more scalable AI solutions that have higher business impact.”
Augmentation and reinforcement learning are much more powerful than out-of-the-box solutions, and this is what’s guiding us along the way. Planning for every feature starts with questions about how the user will be able to play around with and modify the input to see how it affects the result. It was only natural for us here at Sisense to put significant investment into knowledge graphs, NLP, and automated machine learning. Together, they enable users to actively engage with the system, enjoying recommendations along with analysis. These features also facilitate a positive feedback loop, using engagement to strengthen what works and get rid of what doesn’t.
One result is that systems become much more intuitive: Users can take advantage of the “Simply Ask” feature to ask “what are my sales next two months” and receive chatbot messages with projected visualizations and suggestions for further exploration routes. In a similar way, the forthcoming “Explanations” feature automatically surfaces possible drivers of movements in the data, using knowledge graphs to go beyond the boundaries of a user’s charts. This turns problem definition into a multidimensional exercise and lets the system learn from user interactions to personalize and refine its results.
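To make the idea concrete, here is a minimal sketch of the kind of flow a question like “what are my sales next two months” implies: parse the question into a metric and a horizon, then project forward from history. This is an illustration only; the parser, the naive moving-average forecast, and all names are my own invention, not Sisense’s actual implementation.

```python
import re
from statistics import mean

def parse_question(question):
    """Very naive intent parser: extract a metric name and a forecast horizon."""
    metric_match = re.search(r"my (\w+)", question.lower())
    horizon_match = re.search(r"next (\w+) months?", question.lower())
    words_to_num = {"two": 2, "three": 3, "six": 6}  # tiny illustrative lookup
    metric = metric_match.group(1) if metric_match else None
    horizon = words_to_num.get(horizon_match.group(1), 1) if horizon_match else 1
    return metric, horizon

def forecast(history, horizon):
    """Project future values as the rolling mean of the last three observations."""
    projected = []
    window = list(history[-3:])
    for _ in range(horizon):
        nxt = mean(window)
        projected.append(round(nxt, 2))
        window = window[1:] + [nxt]
    return projected

sales = [100, 110, 105, 120, 118, 125]   # invented monthly sales history
metric, months = parse_question("what are my sales next two months")
print(metric, forecast(sales, months))   # e.g. sales [121.0, 121.33]
```

A production system would of course use a trained language model and a real forecasting method, but the shape of the pipeline, question in, projection out, is the same.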
From Forecast to Trends to natural language querying, we are completely transparent about the technology behind our features and the statistical characteristics of their output. Whatever you’re seeing when you use Sisense, you can easily dig into the systems behind it.
Trend 2: Decline of the dashboard
“Dynamic data stories with more automated and consumerized experiences will replace visual, point-and-click authoring and exploration.”
At Amazon, meetings begin with everyone silently reading a full written narrative before the discussion starts, rather than sitting through an endless PowerPoint presentation. They focus on real storytelling rather than bullet points. We expect something similar to happen with dashboards: fetching insight-driven digests just in time, but also accompanying daily routines with an “agent” supporting business flows in various tools.
Do you like to see what you missed first thing in the morning? Be alerted to significant movements? Is an executive summary enough to start the ball rolling, knowing you can always do a deep dive and ask for more? Delivered through your favorite task management solution? The world is moving from the static, rigid experience to the data-, insight-, and personalization-driven assistant that knows how you want specific analytics to be served.
In order to make that work, a number of moving parts need to come together as one well-oiled machine: embedded interfaces (on the go via your device, in your email, chat, or in-app), pretrained analytics services and training pipelines, a vehicle to facilitate data model creation, and the right visualization and narration to make the results digestible, trustworthy, and able to improve over time.
This is what keeps Sisense AIR busy: dashboard automation research and our knowledge graph, which has incorporated the behavior of thousands of past users.
Trend 3: Decision intelligence
“By 2023, more than 33% of large organizations will have analysts practicing decision intelligence, including decision modeling.”
“It provides a framework to help data and analytics leaders design, model, align, execute, monitor, and tune decision models and processes in the context of business outcomes and behavior.”
Decision-making automation requires a lot of steps: First you document the process, then configure it based on the result, then automate the parts that can be automated. My take is that if you can automate the full loop from data to analysis to decision back to data, it is not analytics, it’s robotic process automation. There’s an argument to be made that once decision-making on a use case becomes predictable, it should be moved from BI to the back office.
But that kind of thinking comes from the world we used to know, a world that was less volatile and more manageable, more influenced by its immediate ecosystem than by world events and climate. Today, the world changes at a speed that’s hard to fathom, so decision-making needs to be adjusted based on insights coming from data, accompanied by recommended actions. “Survival of the fastest” is the rule today.
Trend 4: X analytics
“Gartner coined the term ‘X analytics’ to be an umbrella term, where X is the data variable for a range of different structured and unstructured content such as text analytics, video analytics, audio analytics, etc.”
The world is wider than the traditional BI tabular data. It’s visual, it’s spoken, it’s audible. Why use just one of the senses and limit your perspective?
Sisense recently used our ecosystem of ML service providers to help scan and surface medical crowd wisdom about COVID treatments from piles of textual data on a site called G-Med. There was no point in reinventing the wheel to build our own video, image, speech, and text analysis tools — there are plenty of those on the market already.
How exactly is all that data going to talk to each other and come together to provide end-to-end analysis? Knowledge graphs will be the foundation for how data models and data stories are created: first as relatively stable structures and, in the future, generated on demand for each question.
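A knowledge graph at its simplest is a set of (subject, relation, object) triples that can be traversed to assemble context for a question. Here is a minimal sketch; the entities, relations, and dataset names are invented for the example.

```python
# Minimal knowledge-graph sketch: triples linking business entities to
# datasets, traversed to find the data relevant to a question.
# All entity and dataset names are made up for illustration.

triples = [
    ("sales", "measured_in", "orders_table"),
    ("sales", "influenced_by", "weather"),
    ("weather", "measured_in", "noaa_feed"),
    ("orders_table", "joins_on", "region"),
]

def related_datasets(entity, depth=2):
    """Walk the graph up to `depth` hops, collecting datasets along the way."""
    frontier, seen, datasets = {entity}, set(), set()
    for _ in range(depth):
        nxt = set()
        for subj, rel, obj in triples:
            if subj in frontier and obj not in seen:
                seen.add(obj)
                (datasets if rel == "measured_in" else nxt).add(obj)
        frontier = nxt
    return sorted(datasets)

print(related_datasets("sales"))  # finds noaa_feed via the weather hop
```

The point of the second hop is exactly the “beyond the boundaries of your charts” behavior: a question about sales pulls in a weather feed the user never modeled explicitly.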
Trend 5: Augmented data management
“Augmented data management uses ML and AI techniques to optimize and improve operations. It also converts metadata from being used in auditing, lineage and reporting to powering dynamic systems.”
The Gartner article doesn’t go beyond lineage or workload automation. That’s important, but it’s only what’s happening today. Fetching calculation results ahead of the question improves performance, but it’s still limited to the data model or dimensional paradigm of a single individual in the organization. Does that person have the perspective to include hurricane data in the supply chain dashboard for East Asia? Domain experts would likely decide to include that information only after reading about losses in the news. What if the relevant data could be added to the context to tell the data story without humans needing to take action themselves? Data exchanges will play a more significant role in the future, extending their offerings to data modeling.
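One way to imagine that hurricane-data scenario is matching a dashboard’s metadata against a data-exchange catalog and suggesting external datasets automatically. The catalog entries, tags, and scoring rule below are invented assumptions, not a description of any real exchange.

```python
# Illustrative sketch: rank data-exchange datasets by tag overlap with a
# dashboard, so relevant external context can be suggested automatically.
# Catalog entries and tags are invented for the example.

catalog = [
    {"name": "noaa_storm_tracks", "tags": {"weather", "hurricane", "east-asia"}},
    {"name": "brent_crude_prices", "tags": {"energy", "commodities"}},
]

def suggest(dashboard_tags, min_overlap=1):
    """Return catalog dataset names sorted by tag overlap with the dashboard."""
    scored = [(len(d["tags"] & dashboard_tags), d["name"]) for d in catalog]
    return [name for score, name in sorted(scored, reverse=True) if score >= min_overlap]

# A supply-chain dashboard for East Asia surfaces the storm-track feed:
print(suggest({"supply-chain", "east-asia", "hurricane"}))
```

A real system would score on much richer metadata (schemas, join keys, usage history), but the mechanism, matching dashboard context against an external catalog, is the same.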
Trend 6: Cloud is a given
“By 2022, public cloud services will be essential for 90% of data and analytics innovation. As data and analytics moves to the cloud, data and analytics leaders still struggle to align the right services to the right use cases, which leads to unnecessarily increased governance and integration overhead.”
Cloud is here to stay. I witnessed the mainframe/PC/cloud/personal graphics processing unit evolution. To me, the tipping point of cloud analytics will be in the “context as a service” combination of data and logic components served based on user questions. With offerings like AWS Outposts, it couldn’t be easier to start the cloud journey.
In the analytics world, it’s crucial to stay up to date, implementing “continuous integration/continuous delivery” systems and A/B testing for better performance and experience. This is only possible with cloud services. Cloud combined with compliance with the General Data Protection Regulation and SOC is vital to gaining customers’ trust. Data-hungry calculations will be costly to perform in the cloud if the data sits on-premises, due to data gravity and latency. Getting the architecture right makes all the difference in how quickly you can pull insights from large datasets.
Trend 7: Data and analytics worlds collide
“Data and analytics capabilities have traditionally been considered distinct entities and managed accordingly. Vendors offering end-to-end workflows enabled by augmented analytics blur the distinction between the two markets.
The collision of data and analytics will increase interaction and collaboration between historically separate data and analytics roles. This impacts not only the technologies and capabilities provided, but also the people and processes that support and use them. The spectrum of roles will extend from traditional data and analytics roles in IT to information explorer, consumer, and citizen developer as an example.”
I agree that new roles are required. As new data and analytics products are built and every product begins to have data and analytics elements in it, data/knowledge product managers will emerge. These specialists will understand data and be able to run and create queries and transformations but will also be knowledgeable about the applications running on top of those data streams.
Regarding data and tools, “extract, transform, and load” (ETL) will become ETLT. The additional “T” stands for transformation pipelines that run after loading: either bringing in data from exchanges and pretrained ML services, or running training pipelines over both structured and unstructured data. Software developers and data scientists can use these same pipelines to deploy their parts of the application, and analytics workflows can be automated to the point where business users can trigger them without outside help.
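The ETLT idea can be sketched end to end: a classic extract/transform/load, followed by a second transform that enriches the loaded rows with a model score. The data, the keyword-based “sentiment scorer,” and all function names are stand-ins I invented; in practice the second T would call a pretrained ML service.

```python
# Hedged sketch of an "ETLT" flow: extract, transform, load, then a second
# transform that enriches loaded rows. Everything here is illustrative.

def extract():
    """E: pull raw rows from a source system (hardcoded for the sketch)."""
    return [{"id": 1, "review": " Great product "},
            {"id": 2, "review": "broken on arrival"}]

def transform(rows):
    """First T: cleanup and normalization before loading."""
    return [{**r, "review": r["review"].strip().lower()} for r in rows]

def load(rows, warehouse):
    """L: append transformed rows to the warehouse (a plain list here)."""
    warehouse.extend(rows)

def enrich(warehouse):
    """Second T: post-load enrichment via a (mocked) ML sentiment service."""
    positive = {"great", "excellent", "love"}
    for row in warehouse:
        words = set(row["review"].split())
        row["sentiment"] = "positive" if words & positive else "negative"

warehouse = []
load(transform(extract()), warehouse)
enrich(warehouse)
print(warehouse)  # each row now carries a sentiment label
```

Because the enrichment step is just another pipeline stage, a data scientist can swap the mocked scorer for a trained model without touching the E, T, or L code, which is the modularity the ETLT framing is after.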
AI and analytics: Building the future together
If you have data, odds are you have a lot of it. You’ve probably got more than you can handle. Alone, that is. Only AI will be able to help humans make sense of the huge datasets being generated every day by countless individuals and devices. AI systems will play greater and greater roles in our personal and business worlds, so whatever you’re building, start thinking about the ways AI can help your product, service, colleagues, and customers be better. And whatever you’re working on, build boldly.
Inna Tokarev-Sela, Sisense’s Head of AI Research, has over 15 years’ experience in the tech industry. She spent the last decade at SAP, driving innovations in cloud architecture, in-memory products, and machine learning video analytics. A frequent speaker at industry events like IBC, NAB, Wonderland AI, and Media Festival, Inna holds a BS in physics and computer science, an MBA, and an MS in information systems, having written her thesis on neural networks.