The cloud isn’t the future; it’s right now. In the Clouds is where we explore the ways cloud-native architecture, cloud data storage, and cloud analytics are changing key industries and business practices, with anecdotes from experts, how-to’s, and more to help your company excel in the cloud era.

The world of data is constantly changing and speeding up. Companies are storing more types of data from applications as well as the Internet of Things. This data flows into cloud-native warehouses, where data teams transform it, analysts derive vital insights from it, and product teams embed those insights into products. Data is the bedrock on which the future of business is being built.

As the data that these businesses need to thrive continues to grow and its pace of change accelerates, it’s never been more important for employees at all levels of an organization to have fast access to actionable data in order to make strategic, operational, and tactical decisions. It’s both basic table stakes for success in the “new normal” and a defining edge that companies can use to stay ahead of the competition.


Saving lives in real-time

Easier access to fast-updating datasets isn’t just about making better decisions or powering the next killer app. It can also save lives and change the way the world works.

“There are a wide range of scenarios where having super-fast access to real-time data can make a huge difference,” said Christelle Scharff, a professor and computer scientist based at Pace University in New York. “Fast access to data captured by video surveillance systems, for example, can improve security… It’s also the driving force behind autonomous cars. Our biggest industrial firms can use it for preventative maintenance — saving potentially millions of dollars. And almost all organizations can use it to avoid potential threats from security breaches and malware attacks.”

The success of COVID-tracing efforts will depend on fast access to multiple data sources.

George Thiruvathukal, professor of computer science at Loyola University

During our current pandemic, access to real-time data can also save lives. “Health officials are investigating how contact-tracing apps can help manage the reopening of the country after the COVID-19 lockdown,” said George Thiruvathukal, professor of computer science at Loyola University in Chicago. “The success of this will depend on fast access to multiple data sources.”

There’s an impact on customer expectations too. According to recent research from IDC, consumers are embracing personalized real-time engagements and resetting their expectations for data delivery. As their digital world overlaps with their physical realities, they expect to access products and services wherever they are, over whatever connection they have, and on any device. They want data in the moment, on the go, and personalized. As a result, IDC predicts that nearly 30% of the global datasphere will be real-time by 2025.

Pressure on infrastructure builds

As enterprises demand data infrastructures that can meet this growth in real-time data — and ultimately assist with their product differentiation strategy — the pressure put on product teams is huge.

Product teams are already having to manage the growing complexities that come with modern data environments.

Chandana Gopal, Business Analytics Research Director, IDC

“Product teams are already having to manage the growing complexities that come with modern data environments,” said Chandana Gopal, research director for business analytics at IDC. “Not only do they have to deal with data that is distributed across on-premises, hybrid, and multi-cloud environments, but they have to contend with structured, semi-structured, and unstructured data types. Multiple technologies to manage data at rest and in motion have compounded the challenge of managing data and making it accessible to decision-makers in the right time, in the right format, and in the right context.”

Managing cloud data is a key challenge for data and product teams tasked with connecting to a wide array of datasets stored in cloud-native warehouses and other locations. In a large enough company, there can even be multiple clouds operated by different divisions and teams. BI and analytics providers have had to design their platforms to serve up fast insights no matter where the data being analyzed resides, even partnering with third-party companies to make sure that their platforms can handle data from oft-used services like AWS, Google Cloud, and others.
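One concrete face of this challenge is query routing: a platform has to know which store holds which dataset and dispatch each query accordingly. Below is a minimal sketch of that idea in Python, with in-memory SQLite databases standing in for cloud warehouses; the dataset names and tables are invented for illustration, not taken from any vendor’s API:

```python
import sqlite3

# Hypothetical registry mapping logical dataset names to connections.
# In production these would be handles to cloud warehouses; in-memory
# SQLite databases stand in here so the sketch runs anywhere.
WAREHOUSES = {
    "sales": sqlite3.connect(":memory:"),      # stand-in for warehouse A
    "marketing": sqlite3.connect(":memory:"),  # stand-in for warehouse B
}

# Seed each stand-in with a small table (totals stored as integer cents).
WAREHOUSES["sales"].execute("CREATE TABLE orders (id INTEGER, cents INTEGER)")
WAREHOUSES["sales"].executemany(
    "INSERT INTO orders VALUES (?, ?)", [(1, 1999), (2, 4250)]
)
WAREHOUSES["marketing"].execute("CREATE TABLE leads (id INTEGER, source TEXT)")
WAREHOUSES["marketing"].execute("INSERT INTO leads VALUES (1, 'webinar')")

def query(dataset: str, sql: str) -> list:
    """Route a query to whichever warehouse holds the named dataset."""
    return WAREHOUSES[dataset].execute(sql).fetchall()

print(query("sales", "SELECT COUNT(*), SUM(cents) FROM orders"))  # [(2, 6249)]
print(query("marketing", "SELECT source FROM leads"))             # [('webinar',)]
```

Real platforms layer far more on top of this (credentials, dialect differences, query optimization per vendor), but the core design choice is the same: one logical interface over many physical stores.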

This makes sense: customers searching for an analytics solution are often also grappling with recently purchased or prospective cloud options:

“When customers come to us for their BI and analytics needs, in the same sentence they’re often telling us that they’re considering their cloud options,” said Erin Winkler-McCue, Lead for Strategic Partnerships & Special Projects at Sisense. “These conversations are no longer siloed. Customers want to know that our platform will work seamlessly with their chosen cloud vendor, even if that just means something as basic-sounding as making sure queries between the vendor and Sisense are optimized.”

The challenges these teams face become even more daunting when one looks toward the future, as new technologies like the Internet of Things, machine learning, 5G, and augmented reality will add a new level of demand. Forbes Insights data shows that, in order to benefit from emerging technologies like these, 92% of CIOs and CTOs say their businesses will require faster download and response times in the near future. What’s concerning is that, despite recognizing this, just 1% of data center engineers believe their data centers are updated ahead of current needs.


Competing priorities within companies

All of this results in a lot of friction within data-driven organizations. “Multiple technologies are required for managing, integrating, and controlling the flow and consumption of data from the edge to the cloud and all points in between. That’s without mentioning outdated metadata—the data about data that provides data intelligence,” said Gopal.

An upcoming skills gap might compound the problem: According to the Forbes Insights research, 37% of engineers say they will likely retire in the next 10 years.

Adding to this hurdle is the fact that some firms are led by executives who don’t understand or champion the importance of having contextual and timely data embedded into applications. Recent research by Exasol found that less than half of decision-makers believe that those working in senior management (40%) or mid-management (32%) roles are very effectively informed of their organization’s data strategy.

Creating a path to success

Gopal believes that future success requires that data teams take a structured approach focused on people, processes, and technology in order to make data available to all.

“Data teams should identify short- and long-term data and analytics use cases that will demonstrate business value with input from stakeholders at all levels—both business and IT,” she said. “They should also identify data-related assets that will be required for the project and be realistic about time constraints. They should then look to deliver measurable value with short-term projects to build business cases for more expensive or longer projects.”

Teams should look to deliver measurable value with short-term projects to build business cases for more expensive or longer projects.

Chandana Gopal, Business Analytics Research Director, IDC

From a technology perspective, the introduction of new technologies, such as 5G-enabled edge computing, will have an impact on IT staffing. According to the Forbes Insights report, almost three-quarters (74%) of C-suite executives believe staffing will be reduced or handled by external cloud or edge service providers. The ability to implement new technologies like these in the data center will be a competitive differentiator, as will better security (according to 43% of respondents) and bandwidth (according to 27%).

Ultimately, it’s clear that organizations need to act quickly if they want to succeed. “Continuous efforts to update the data center will be integral to business success,” states the Forbes Insights report. “Partnering with external third parties is a central part of the data center journey in the age of hyper-connectivity.”

These collaborations are happening between companies and their cloud providers and between platforms like Sisense and companies like Amazon and Google:

“Partnerships with companies like AWS, Snowflake, Microsoft, and Google are only becoming more important as the modern data landscape evolves,” said Erin Winkler-McCue, Lead for Strategic Partnerships & Special Projects at Sisense. “We feel like every customer is either already in the cloud or it’s only a matter of time until they contemplate setting up a hybrid- or full-cloud model.”

Gopal agrees that adopting new technology through new partnerships is key: “A new class of intelligent data operations platforms is emerging that can reduce friction, improve efficiencies with automation, and provide flexibility and openness with policy- and metadata-driven processes that can accommodate the diversity and distribution of data in modern environments,” she said. Equipped with these platforms, product teams will be much better prepared for a new and exciting future.


Lindsay James is a journalist and copywriter with over 20 years’ experience writing for enterprise business audiences. She has had the privilege of creating all sorts of copy for some of the world’s biggest companies and is a regular contributor to The Record, Compass, and IT Pro.
