An often overlooked aspect of a company's journey to the cloud is cost visibility. The single number delivered by the cloud provider on a monthly invoice is straightforward, but understanding where that number comes from is often much trickier. Fortunately, this task is made easier by the various cost monitoring tools available on the market, both from third-party companies and from the cloud providers themselves. Such tools not only provide context about current expenses but also make it possible to forecast future spending. The resulting insights are often eye-opening and can sometimes change the priorities of the whole company.

The typical journey of analyzing cloud costs starts with dissecting the bill: breaking it down by usage type, by service, or by the teams that own them. By analyzing the trends in such reports, we can observe where our spending increases over time.

Unfortunately, looking at raw dollar trends doesn't show us the full picture, because it fails to answer whether a cost increase was justified. Raw dollars on a bill only represent an expense; they carry no notion of the corresponding value delivered to the company. For instance, a 50% increase in the cost of a certain service is not inherently a bad thing if the number of customers grew by 100% over the same period.

To address this challenge, instead of looking at absolute costs, it's better to correlate them with the delivered value through unit economics. This article describes how we use the Sumo Logic Continuous Intelligence™ platform to monitor unit economics and improve our cloud cost-efficiency, and how you can do it too!

What is unit economics?

Unit economics is a method of representing costs relative to a metric directly connected to the company's revenue model. For SaaS companies, this metric is often the number of customers, but other dimensions, such as the amount of processed data or the number of transactions, can also be used. Unit economics expresses cost-efficiency, that is, the cost incurred per single unit of measurement, hence the name. Since the resulting figure is directly linked to the size of the business, its trend can be analyzed over time without any additional context. Ideally, we want to drive this number as close to zero as possible.
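
To make the earlier example concrete, here is a toy calculation (the numbers are made up) showing how the cost per customer can fall even while the absolute bill grows:

```python
# Toy illustration with made-up numbers: absolute spend rises, unit cost falls.
last_month_cost, last_month_customers = 100_000, 1_000   # $100k, 1,000 customers
this_month_cost, this_month_customers = 150_000, 2_000   # +50% cost, +100% customers

last_unit_cost = last_month_cost / last_month_customers  # $100 per customer
this_unit_cost = this_month_cost / this_month_customers  # $75 per customer

change = (this_unit_cost - last_unit_cost) / last_unit_cost
print(f"Cost per customer changed by {change:.0%}")      # prints -25%
```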

Unfortunately, in practice it's still non-trivial to monitor the unit economics of cloud costs in near real time. While many tools enable an in-depth analysis of the cloud bill, most of them don't provide a way to visualize unit economics.

A couple of years ago we saw that having near-real-time monitoring of unit economics would help our engineering teams better understand their cloud spend and lead us to improve our overall cost-efficiency. Since none of the available solutions met our requirements, we decided to create internal tooling for this purpose, built on top of the Sumo Logic platform.

Sumo Logic's approach to unit economics

At Sumo, the metric we use as the basis of unit economics is the volume of logs ingested by our customers. Our data pipeline continuously reports its ingestion statistics in the form of loglines fed into an internal organization in one of our deployments, making it easy to query them later on.

The core element of our unit economics monitoring is a script that retrieves both AWS billing information and the customer ingestion statistics via the Sumo Logic API. The costs are grouped along several dimensions (service, owning team, deployment, etc.) and then divided by the corresponding ingestion size. The result of this operation is the cost per gigabyte (GB) of ingest for every component of our system. These efficiency numbers, together with their identifying dimensions, are formatted as well-structured loglines and ingested back into Sumo Logic. These loglines feed several dashboards, which display our cost per GB of ingest at different levels of granularity.
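
As a rough illustration of the idea (not our actual internal tooling), the sketch below pulls daily per-service costs from AWS Cost Explorer with boto3 and divides them by ingestion volume. The get_ingested_gb() helper is a hypothetical placeholder for the Sumo Logic API query that returns the GB ingested per day, and the dimensions are reduced to just the service:

```python
import json
from datetime import date, timedelta

import boto3  # AWS SDK for Python; Cost Explorer must be enabled on the account


def get_ingested_gb(day: str) -> float:
    """Hypothetical placeholder for a Sumo Logic API query returning the
    total GB of customer logs ingested on the given day."""
    raise NotImplementedError


def daily_cost_per_gb(days: int = 30) -> list:
    """Pull daily per-service AWS costs and divide them by ingestion volume."""
    ce = boto3.client("ce")
    end = date.today()
    start = end - timedelta(days=days)
    resp = ce.get_cost_and_usage(  # pagination via NextPageToken omitted for brevity
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],  # only one dimension here
    )

    records = []
    for day_result in resp["ResultsByTime"]:
        day = day_result["TimePeriod"]["Start"]
        gb = get_ingested_gb(day)
        for group in day_result["Groups"]:
            cost = float(group["Metrics"]["UnblendedCost"]["Amount"])
            records.append({
                "date": day,
                "service": group["Keys"][0],
                "cost_usd": round(cost, 2),
                "cost_per_gb": round(cost / gb, 6),
            })
    return records


# Each record becomes one well-structured logline, ready to be ingested back.
for record in daily_cost_per_gb():
    print(json.dumps(record))
```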

The script described above runs as a Jenkins job every couple of hours, multiple times a day. By design, each execution outputs the complete history of data (the last 30 days with daily granularity and the last 12 months with monthly granularity) instead of just the most recent data point. This way, any changes to cost classification are reflected retroactively: assigning a service to a new engineering owner reassigns not only its future costs but also those in the past. The dashboards also load almost instantaneously, as only the data from the last couple of hours needs to be queried to retrieve the entire cost history.
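
One simple way to get this behavior, sketched below under our own field-naming assumptions (not necessarily how our internal tooling does it), is to stamp every emitted record with the time of the run; a dashboard then filters on the newest batch and still sees the full history:

```python
from datetime import datetime, timezone


def stamp_with_run_time(records: list) -> list:
    """Re-emit the complete cost history on every run, tagged with the run time.

    Because each run carries the full 30-day / 12-month window, reclassified
    costs (e.g. a service moved to a new owning team) are corrected
    retroactively, and a dashboard only needs to search the last few hours
    of loglines to reconstruct the entire, up-to-date history.
    """
    run_ts = datetime.now(timezone.utc).isoformat()
    return [{**record, "run_ts": run_ts} for record in records]
```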

Using multiple Sumo Logic dashboards with different levels of granularity enables us to present this information to different stakeholders without overloading them with unnecessary details. The screenshot below shows a fragment of one of our main dashboards targeted at engineering, presenting an overview of our per-unit cost-efficiency. It focuses on the high-level metrics (cost-efficiency per team and deployment) and therefore is often the first place to start before diving into more detailed views. We also use a dedicated dashboard for each engineering team, giving them more detailed insights into the cost per GB of the components for which they are responsible.

A snapshot of one of our main dashboards that focuses on high-level metrics before diving into more detailed views.

Another pillar of our unit economics monitoring solution is the weekly email alerts that are sent to the engineering owners of each architectural area. Through these recurring updates, they can track the latest cost-efficiency trends in their area of interest and follow up with their team to react quickly to any potential regressions. A screenshot of one such email is shown below.
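
The mechanics of such a digest are straightforward. A minimal, purely illustrative sketch is shown below (the sender address, SMTP host, and field names are made up, and a scheduled Sumo Logic search with an email notification could serve the same purpose):

```python
import smtplib
from email.message import EmailMessage


def send_weekly_summary(area: str, owner_email: str, rows: list,
                        smtp_host: str = "localhost") -> None:
    """Illustrative only: email an area's recent cost-per-GB trend to its owners."""
    lines = [f"{row['date']}: ${row['cost_per_gb']:.4f} per GB" for row in rows]
    msg = EmailMessage()
    msg["Subject"] = f"Weekly cost-efficiency summary: {area}"
    msg["From"] = "unit-economics@example.com"   # made-up sender address
    msg["To"] = owner_email
    msg.set_content("\n".join(lines))
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)
```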

Both the weekly notifications and the dashboards help our engineering teams understand and follow the trends in the cost-efficiency of their part of the infrastructure. Following the old saying that you can only improve what you measure, surfacing these statistics and forwarding them to the owning teams empowers them to make educated decisions and prioritize cost optimization projects accordingly, making sure we are where we want to be in terms of cost-efficiency.

Five steps to get started with unit economics

Applying unit economics to cloud computing gives us greater visibility and thereby control over our costs. It's easy to get this kind of granularity from your data, too. Here are a few easy steps to get started with unit economics monitoring in your organization.

  1. Determine a metric to use as the basis of unit economics. For example, we use the volume of logs ingested by our customers.

  2. Choose the dimensions to slice the data by, for example, service, owning team, deployment, etc.

  3. Pull billing data and correlate it with the dimensions above.

  4. Divide the correlated billing data by the metric from step one to get the per-unit cost-efficiency for the desired dimensions.

  5. Ingest this data into Sumo Logic and use it to create dashboards and alerts; one way to post the records is sketched below.
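
For the last step, a common ingestion path is a Sumo Logic hosted HTTP Source: the script POSTs one JSON logline per record to the source's unique URL, and dashboards and alerts are then built on top of those loglines. A minimal sketch, assuming the requests library is installed and using a placeholder endpoint URL:

```python
import json

import requests  # third-party HTTP client, assumed to be installed

# Unique URL of a Sumo Logic hosted HTTP Source (placeholder value).
SUMO_HTTP_SOURCE_URL = "https://endpoint.collection.sumologic.com/receiver/v1/http/XXXX"


def ingest_records(records: list) -> None:
    """POST one JSON logline per record to the HTTP Source."""
    payload = "\n".join(json.dumps(record) for record in records)
    resp = requests.post(SUMO_HTTP_SOURCE_URL, data=payload, timeout=30)
    resp.raise_for_status()
```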

Summary

Here are a few points to remember as you apply unit economics to your cloud computing:

  • Cloud cost visibility is not only about the absolute spend. Looking at the costs relative to the size of the business often shows a much broader picture.

  • Popular cost monitoring tools typically do not provide the means to track unit economics.

  • As part of our Continuous Intelligence Platform, Sumo Logic Application Observability can easily be used to address unit economics monitoring. We're doing it here at Sumo with great success!

