Cloud-Native Sisense Cost Reduction Advantages

Money Matters

Cost is always a factor when planning a project or choosing a software platform. The more computing power, storage space, and machines you need to spin up, the higher the costs you'll have to plan for. Linux is one of the most popular operating systems for cloud-based systems, and Cloud-Native Sisense on Linux is designed not just to build and seamlessly integrate powerful analytic apps into your infrastructure as code (IaC) of choice on any cloud, but also to provide significant cost savings.

This white paper will dig into how Sisense does that, starting with Cloud-Native Sisense’s strengths relating to costs, then breaking down some sample numbers for a Cloud-Native Sisense on Linux deployment.

How Cloud-Native Sisense Reduces Costs

The Sisense architecture on Linux is a modern, cloud-native design that helps lower the total cost of ownership (TCO) of your Sisense deployment. A multi-node, high-availability deployment of Sisense on Linux can cost about half as much as the same deployment on the Windows architecture. The major contributors to this cost reduction are a lower cost per machine, autoscaling, and improved machine utilization.

Lower Costs of Linux vs Windows

Cloud providers commonly offer both Windows and Linux options when customers are putting together a package (AWS, Azure, and Google Cloud all offer both), and customers usually choose the OS their IT and data engineering teams are most familiar with. Going with whatever OS your team is most comfortable with can seem like a simple choice, but financially it's a major decision: pricing for Windows machines includes licensing and maintenance fees that go to Microsoft, while machines running open-source Linux carry no such fees. Because Linux is open-source and free of licensing fees, the hourly price of a Linux machine can be about half that of the same machine running Windows, on all of the major cloud vendors.
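To see how that licensing difference compounds over a year of continuous operation, here's a minimal sketch. The hourly rates are purely hypothetical placeholders (the roughly 2x Windows premium mirrors the licensing gap described above; real prices vary by provider, region, and instance type):

```python
# Illustrative annual compute cost for a fleet of identical machines.
# The hourly rates below are hypothetical, NOT actual cloud prices.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours of continuous operation

def annual_cost(hourly_rate: float, machines: int) -> float:
    """Annual on-demand cost for a fleet of identical machines."""
    return hourly_rate * HOURS_PER_YEAR * machines

linux_rate = 0.40    # hypothetical $/hour for a Linux instance
windows_rate = 0.80  # same instance type with Windows licensing included

linux_total = annual_cost(linux_rate, machines=8)
windows_total = annual_cost(windows_rate, machines=8)

print(f"Linux:   ${linux_total:,.0f}/year")
print(f"Windows: ${windows_total:,.0f}/year")
print(f"Savings: {1 - linux_total / windows_total:.0%}")
```

With these example rates, an 8-machine fleet on Linux costs about half what the same fleet costs on Windows, and the gap scales linearly with fleet size and uptime.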

Autoscaling for Peak Times

The great promise of cloud computing and storage was that tech companies could get the power and storage they needed, when they needed it, without paying for machines they didn’t need at other times. Here, Sisense’s cloud-native architecture comes into play again: The Kubernetes orchestrator in a Linux high-availability deployment allows a new kind of resource planning, taking into account peak Sisense usage times.

Peak usage can take a variety of forms. First, you could see a large number of ElastiCube data models being built at once, creating demand for extra CPU and RAM resources; multiple people working on data models at the same time is almost impossible to plan for. Alternatively, it's common to see dashboard views spike at the end of the month, which requires extra query nodes. With Linux autoscaling and Sisense, there's no need to deploy resources in advance (resources that will sit idle most of the time) just to support maximum usage at peak times. This results in large cost savings versus reserving fully dedicated machines or adding permanent RAM and CPU resources to query and build nodes. It's a decision that makes the most of the unique capabilities of Linux and Sisense.

Maximizing Resource Utilization

In some cases, a machine's resources (RAM or CPU) are not fully utilized, and the limiting factor is instead the maximum number of ElastiCubes supported per server. In those cases, the Sisense Windows architecture, which supports a maximum of 40 ElastiCubes per server, requires deploying more nodes even when RAM and CPU utilization is low. The cloud-native Linux architecture allows a single node (query or build) to support up to 200 ElastiCubes, meaning the same number of cubes can run on as little as one-fifth of the machines a Windows deployment would require. This yields cost savings of 2x to 5x, depending on the data sizes of the cubes in the deployment.
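The node-count arithmetic behind that claim is straightforward. The per-node caps (40 on Windows, up to 200 on Linux) come from the figures above; the cube count of 400 is just an illustrative example:

```python
import math

# Nodes needed when the per-node ElastiCube cap, not RAM or CPU,
# is the binding constraint. Caps are from the text above; the
# cube count is an illustrative example, not a real deployment.

def nodes_required(cubes: int, cap_per_node: int) -> int:
    """Minimum number of nodes to host the given number of cubes."""
    return math.ceil(cubes / cap_per_node)

cubes = 400
windows_nodes = nodes_required(cubes, cap_per_node=40)   # Windows cap
linux_nodes = nodes_required(cubes, cap_per_node=200)    # Linux cap

print(windows_nodes, linux_nodes)  # 10 2
```

At equal per-machine prices this alone is a 5x reduction; in practice the ratio lands between 2x and 5x because larger cubes can make RAM, rather than the cube cap, the binding constraint.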

Running the Numbers

To drive home how a cloud-native Sisense deployment on Linux improves performance and saves money, let's walk through how this configuration could work in practice. This breakdown compares the cost of a Linux deployment against a Windows deployment, analyzing a customer's existing Windows deployment versus a potential Linux deployment.

The customer had a large Windows deployment, with 8 machines: 4 strong machines for the build node and query nodes, and 4 weaker machines for running the application nodes, deployed as follows:

We estimated the annual cost to run these nodes on AWS to be about $140,000:

We evaluated the system's performance and load, including checking system usage over the last 30-day period. We looked at the actual query concurrency of the system during the period, as well as system usage patterns. We found that Sisense could support the same load on Linux using slightly fewer machines, since the application nodes, which are not resource-intensive, could run on the query nodes. The deployment would look like this:

The new estimated annual cost to run these nodes on AWS would be about $52,000:

Next, we analyzed usage patterns and found distinct cyclical behavior:

  • Large differences every 12 hours, indicating different usage between night and day.
  • Usage mid-week was much higher than the usage over the weekend.
  • Only a few days of the month showed peak usage (about 20% of the time).

This usage pattern is almost ideally suited to the cloud-native Linux autoscaling feature, allowing Sisense to deliver higher performance at key times by scaling up the deployment only when needed. Our analysis showed that the new architecture could support 5 times the scale while increasing cost by less than 2 times. The suggested deployment used reserved instances for the baseline deployment and more expensive on-demand instances for peak times, a flexible purchasing model supported by the cloud-native Linux deployment.
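The reserved-plus-on-demand arithmetic can be sketched as follows. The ~20% peak fraction echoes the usage analysis above; every node count and hourly rate here is a hypothetical illustration, not a figure from the actual deployment:

```python
# Compare provisioning for peak capacity 24/7 against a smaller
# reserved baseline topped up with on-demand nodes only during peaks.
# All node counts and rates are hypothetical; only the ~20% peak
# fraction comes from the usage analysis above.

HOURS_PER_YEAR = 8760

def annual_blended_cost(reserved_nodes: int, reserved_rate: float,
                        peak_nodes: int, on_demand_rate: float,
                        peak_fraction: float) -> float:
    """Reserved nodes run all year; on-demand nodes run only at peaks."""
    baseline = reserved_nodes * reserved_rate * HOURS_PER_YEAR
    peak = peak_nodes * on_demand_rate * HOURS_PER_YEAR * peak_fraction
    return baseline + peak

# Option A: keep 10 reserved nodes running year-round to cover peaks.
provision_for_peak = annual_blended_cost(10, 0.50, 0, 0.80, 0.0)

# Option B: 4 reserved baseline nodes, plus 6 on-demand nodes that
# autoscaling adds during the ~20% of time the system is at peak.
autoscaled = annual_blended_cost(4, 0.50, 6, 0.80, 0.20)

print(f"Provision for peak: ${provision_for_peak:,.0f}/year")
print(f"Autoscaled:         ${autoscaled:,.0f}/year")
```

Even though the on-demand hourly rate is higher, paying it for only a fifth of the year costs far less than keeping peak capacity reserved around the clock, which is the mechanism behind supporting more scale for less than proportional cost.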

The suggested deployment looks like this:

The new estimated annual cost to run these nodes on AWS would be about $90,000:

As you can see, this analysis shows just how much some businesses could save with a cloud-native Sisense on Linux deployment. Keep in mind: this analysis only includes the machine costs on AWS. It doesn’t include additional AWS fees such as support fees, storage fees, I/O, or network charges.

Reaching for the Clouds

Cloud computing is the future. Whether you're an enterprise migrating all your apps and data to the cloud (public or private) or a startup creating a cloud-native app from scratch, you'll be building the future of your business on the cloud. Whichever route you take, Sisense provides an open, API-driven analytics platform built on modern cloud-native technologies that powers application innovation and new analytics-driven digital possibilities.

Go Cloud-Native!