There's no denying that cloud computing is gaining more and more traction, and a lesser-known but staggering prediction puts the numbers in perspective: 65.9% of all enterprise software spending will go to cloud technologies in 2025, up from less than 60% in 2022.
Unfortunately, many companies have yet to get a handle on the cost implications of cloud technologies, even with the rising number of FinOps solutions dedicated to reducing cloud spend.
On the Google Cloud Platform (GCP), Cloud Logging is a dedicated service that collects, organizes, analyzes, and monitors log data, making it easier to troubleshoot and monitor applications, infrastructure, and services.
With the rising complexity of cloud systems, effective logging and monitoring matter more than ever, and GCP Cloud Logging plays a crucial role. The downside is that, as with any cloud-based service, it can come at a steep cost when left unmanaged, contributing significantly to your overall cloud spend.
This article will provide insight into the GCP Cloud Logging pricing model, cost optimization techniques, and the right solution to manage costs and maximize your cloud spend.
Table of Contents
- GCP Cloud Logging Pricing Model
- Factors Affecting Cloud Logging Pricing
- Cost Optimization Techniques
- Optimize Cloud Logging Cost
GCP Cloud Logging Pricing Model
Like many services across cloud platforms, GCP Cloud Logging can contribute significantly to the overall cost of your cloud operations, which makes its pricing model worth understanding.
GCP Cloud Logging pricing is based on two critical components: logging ingestion and logging storage.
- Logging ingestion: This occurs when log data is sent to GCP Cloud Logging through a logging agent or the API, so the total cost of ingestion depends entirely on the volume of log data sent. As of April 2023, the price is $0.50 per GB of logs ingested, a one-time fee that covers ingestion plus the first 30 days of storage. The first 50 GB ingested per project per month is free.
- Logging storage: This refers to the amount of log data stored and its retention period. As expected, longer retention periods result in higher costs, but may be necessary to meet regulatory requirements. As of April 2023, logs retained beyond the default 30 days cost $0.01 per GB per month, billed monthly. Logs retained for the default retention period incur no storage cost.
Let's look at a practical example to better illustrate the GCP Cloud Logging pricing model. Suppose a web application generates 500 GB of log data monthly. The first 50 GB is free, so the remaining 450 GB is billed at $0.50 per GB, for an ingestion cost of $225 per month. To retain the data for 90 days, you pay $0.01 per GB per month for the 60 days beyond the included 30, adding roughly $10 per month in storage (500 GB × $0.01 × 2 months). The total cost for GCP Cloud Logging in this scenario would be about $235 per month.
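To make the arithmetic explicit, here is a minimal sketch of the same calculation in Python; the rates are the April 2023 prices quoted above, and the volume and retention figures are just the example values:
```
# Example GCP Cloud Logging cost calculation (rates as of April 2023).
INGESTION_PRICE_PER_GB = 0.50       # one-time fee; includes the first 30 days of storage
STORAGE_PRICE_PER_GB_MONTH = 0.01   # applies to logs kept beyond the default 30 days
FREE_INGESTION_GB = 50              # free tier per project per month

monthly_volume_gb = 500
retention_days = 90

# Ingestion: only the volume above the free tier is billed.
ingestion_cost = max(monthly_volume_gb - FREE_INGESTION_GB, 0) * INGESTION_PRICE_PER_GB

# Storage: the first 30 days are included; the remaining days are billed monthly.
extra_months = max(retention_days - 30, 0) / 30
storage_cost = monthly_volume_gb * STORAGE_PRICE_PER_GB_MONTH * extra_months

print(f"Ingestion: ${ingestion_cost:.2f}")                 # $225.00
print(f"Storage:   ${storage_cost:.2f}")                   # $10.00
print(f"Total:     ${ingestion_cost + storage_cost:.2f}")  # $235.00
```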
Beyond this pricing model, if you export log data to BigQuery for further analysis, additional fees apply based on the amount of data exported.
Factors Affecting Cloud Logging Pricing
Without adequate visibility and control, GCP Cloud Logging can become very expensive for your business. Several factors drive this, including:
- High Volume of Logs: Generating a large number of logs across various services increases ingestion, storage, and processing costs; the more log data you send to GCP Cloud Logging, the more you pay.
- Long Retention Periods: Storing logs for extended periods increases storage costs, so regularly assess your retention policies to keep costs under control.
- Inefficient Filters: Ingesting unnecessary logs that don't provide valuable insights inflates costs, since every gigabyte ingested adds to your ingestion and storage bill.
- Lack of Log-Based Metrics: Without log-based metrics, you must process raw logs, which can be even more expensive. By leveraging log-based metrics, you can gain valuable insights into your cloud infrastructure and applications without incurring the costs of processing raw log data.
Cost Optimization Techniques
There are many ways to reduce or optimize your GCP Cloud Logging costs, including setting up exclusions for logs that are irrelevant or redundant and creating log sinks to route logs to specific destinations. You can also employ more advanced cost-control strategies, such as fine-tuning retention policies based on log type and frequency of access, or using custom metrics to monitor and alert on unusual log patterns.
Set Exclusion Filters
You can use the GCP Console to set up an exclusion filter that prevents specific logs from being ingested into GCP Cloud Logging. Navigate to Logging > Logs Router and click "Create Sink" to create a new sink; in the "Create sink" dialog, give your sink a name and select a destination. If you already have a sink in place, you can edit it to add the exclusion filter instead.
Under "Choose logs to filter out of sink," create an exclusion filter matching the logs you want to exclude: for instance, logs from a specific resource, logs with a specific log level, or logs containing specific text. The filter is written in the Logging Query Language, so you can express precise conditions. For example, the following filter excludes all logs from a particular Compute Engine instance:
```
resource.type="gce_instance"
AND
resource.labels.instance_id="INSTANCE_ID"
```
By creating exclusion filters, you avoid ingesting unnecessary logs, reducing the storage and processing costs associated with that data. With regular reviews and updates of these filters, you can ensure that you ingest only the logs that provide valuable insights into your cloud infrastructure and applications while minimizing your GCP Cloud Logging costs.
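If you manage configuration as code, exclusions can also be created programmatically. Below is a minimal sketch using the google-cloud-logging Python client; the project ID, exclusion name, and instance ID are placeholders:
```
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogExclusion

client = ConfigServiceV2Client()

# Exclude all logs from one noisy Compute Engine instance (placeholder values).
exclusion = LogExclusion(
    name="exclude-noisy-instance",
    description="Drop logs from a chatty GCE instance to cut ingestion costs",
    filter='resource.type="gce_instance" AND resource.labels.instance_id="INSTANCE_ID"',
)

client.create_exclusion(parent="projects/my-project", exclusion=exclusion)
```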
Export Logs to Cheaper Storage
Consider exporting logs to cheaper storage options like Google Cloud Storage or BigQuery. By doing this, you can use more affordable storage options to save money while retaining access to your log data.
To export logs to Google Cloud Storage or BigQuery, configure a log sink in the GCP Console. A sink specifies which log entries should be exported and where they should be routed, letting you take advantage of these services' more affordable storage options while still retaining access to your log data.
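The same sink can be created from code. Here is a minimal sketch with the google-cloud-logging Python client; the project ID, sink name, and bucket name are placeholders:
```
from google.cloud import logging

client = logging.Client(project="my-project")

# Route Compute Engine logs to a Cloud Storage bucket (placeholder names).
sink = client.sink(
    "gce-logs-to-gcs",
    filter_='resource.type="gce_instance"',
    destination="storage.googleapis.com/my-log-archive-bucket",
)

if not sink.exists():
    sink.create()
    # Note: you must still grant the sink's writer identity
    # permission to write to the destination bucket.
```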
In addition to the cost savings associated with exporting logs to cheaper storage options, you can also leverage the powerful analytics offered by these services to gain deeper insights into your log data. By using BigQuery to analyze log data, for example, you can easily identify trends and anomalies in your cloud infrastructure and applications, leading to more efficient monitoring and troubleshooting.
Set Log Retention
Another way to optimize GCP Cloud Logging costs is to consider the retention period for logs stored in Cloud Logging. Logs are typically retained for 30 days and don't incur storage costs during this period. However, retaining logs for longer than the default retention period can result in increased storage costs.
To reduce log storage costs, carefully review your retention policies and adjust them as needed. For example, if you don't need logs beyond the default retention period, set your retention policies so that logs are automatically deleted after 30 days.
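As a sketch of how this looks in code, the retention period of a log bucket can be adjusted with the google-cloud-logging Python client; here the _Default bucket is set back to the 30-day default (the project ID is a placeholder):
```
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogBucket, UpdateBucketRequest
from google.protobuf import field_mask_pb2

client = ConfigServiceV2Client()

# Set the _Default bucket to the 30-day default retention (placeholder project ID).
request = UpdateBucketRequest(
    name="projects/my-project/locations/global/buckets/_Default",
    bucket=LogBucket(retention_days=30),
    update_mask=field_mask_pb2.FieldMask(paths=["retention_days"]),
)
client.update_bucket(request=request)
```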
Use Log-based Metrics
In GCP Cloud Logging, you can cut costs and reduce the need for long-term storage and processing by creating metrics from logs. Log-based metrics aggregate and summarize log data, which reduces the amount of raw log data you need to store and process, particularly if you generate a high volume of logs.
This also means you can customize your log data based on your specific operational needs. For example, you can create metrics that track specific application events or errors, or even metrics that offer insights on infrastructure performance.
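For instance, here is a minimal sketch that creates a log-based counter metric for ERROR-level logs using the google-cloud-logging Python client (the project ID, metric name, and filter are illustrative):
```
from google.cloud import logging

client = logging.Client(project="my-project")

# Counter metric for logs at ERROR severity or higher (illustrative values).
metric = client.metric(
    "error-log-count",
    filter_="severity>=ERROR",
    description="Counts log entries at ERROR severity or higher",
)

if not metric.exists():
    metric.create()
```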
Use Resource-based IAM Policies
Resource-based IAM policies can be used to restrict access to logs in GCP Cloud Logging. By applying these policies, you can reduce the need for expensive log auditing and ensure that only authorized users can access your logs. This can help you meet compliance and regulatory requirements while also reducing your overall costs.
Additionally, resource-based IAM policies can help you maintain better control over your logs, allowing you to better monitor usage and track access.
Optimize Cloud Logging Cost
Logging can be expensive, especially when you log large volumes of data at high throughput, so it's essential to be aware of the associated costs, and tracking them manually is highly burdensome. To make it easier, FinOps tools like Finout can be incredibly helpful in identifying wasteful spending and highlighting areas that need improvement.
Finout is a cloud cost monitoring platform that can help you implement the best techniques for optimizing Cloud Logging costs, giving you granular visibility into your log usage and recommending ways to reduce spend. With Finout's advanced analytics and cost optimization features, you can get visibility into your overall cloud costs across all platforms – beyond just GCP (Azure, AWS, Snowflake, DataDog, etc.) – all from a single dashboard.
So, don't let GCP Cloud Logging costs drain your budget; book a demo today to see how Finout can help optimize your GCP Cloud Logging costs.