According to Forrester, almost 60% of enterprises in North America currently rely on public cloud platforms, five times the share from five years ago. Even without these figures, it’s obvious to any IT professional today that cloud is here to stay and that adoption will only grow in the coming years.
Even though the presence of enterprises in the public cloud has grown massively over the last couple of years, two major concerns persist: security and cost.
Due to continual breaches, such as the Capital One incident a couple of months ago or the exposure of 198 million US voters’ records back in 2017, security remains a burning topic whenever cloud migration is mentioned. (It’s worth noting that most of these incidents were caused by human error or a lack of knowledge on how to properly architect and deploy a cloud computing environment.)
While cost is another major concern, it doesn’t sound as alarming as a security issue. After all, a bigger cloud bill will impact your business, but it won’t give you negative publicity.
If you’ve been following the Jatheon blog lately, you’ll remember our series of cloud security articles. We’ll now go on to explain the costs of a public cloud presence (using AWS as an example) and demonstrate why cost shouldn’t be a constraint on your public cloud adoption, especially since there are AWS tools and best practices to help you along the way.
Why is cloud so complex when it comes to cost calculation?
Any enterprise that maintains its own datacenter knows what a daunting and cumbersome task that is. And yet, budgeting is probably the easiest of all the tasks required to set up a proper datacenter environment.
You calculate your capital expenses (CAPEX), such as the initial cost of leasing the datacenter space and racks, the total cost of all hardware that needs to be purchased (servers, storage, network equipment with proper support plans) and required software (if you’re not tied to some existing subscription plans). That’s the initial sum of money that needs to be paid upfront, and only once, when the datacenter is provisioned.
The rest of the budget consists of operational costs (OPEX), or those that occur on a monthly or yearly basis, such as the cost of electricity, leased network lines, other support plans you need (such as Oracle/Red Hat/Microsoft) or monthly licensing plans.
In such environments, the majority of the costs are CAPEX and can be easily calculated. In datacenters, hardware is typically refreshed every 3 to 5 years, so you’ll need to factor that in as well when making long-term plans. But, all in all, with proper planning, your costs will be predictable and straightforward.
When it comes to cloud, public cloud providers follow the “pay as you go” model, which means that the majority of costs are OPEX and that you’ll be paying a monthly bill for services you consume. That monthly bill will be a sum of all used services in a month in all geographical regions where you have cloud presence.
Let’s take AWS as an example, and say you’re a US company with cloud presence in two regions, us-east-1 and us-west-2. Your monthly bill will be a sum of all services consumed in these two AWS regions. Sounds pretty straightforward, right?
In theory, but not so much in practice. This is because each AWS service has its own dedicated pricing model, and in order to predict your bill with confidence, you need to know all the ins and outs of pricing for each of the services you use.
Since an average AWS customer uses more than a dozen compute, storage, networking, database and other services, it turns out that predicting your cloud bill isn’t that easy.
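As a toy illustration of the principle, the bill is simply a sum over every service in every region you use. The regions match the earlier example, but all dollar figures below are made up:

```python
# Toy illustration: an AWS monthly bill is the sum of every service's
# cost in every region you use. All figures here are made up.
monthly_usage = {
    "us-east-1": {"EC2": 120.50, "S3": 8.20, "RDS": 98.00},
    "us-west-2": {"EC2": 60.25, "ELB": 21.96},
}

def total_bill(usage):
    """Sum service costs across all regions."""
    return round(sum(cost for region in usage.values()
                     for cost in region.values()), 2)

print(total_bill(monthly_usage))  # 308.91
```

The hard part, of course, is not the addition but knowing what each of those per-service numbers will be, which is exactly where the per-service pricing models come in.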
And while data centers come with easier budgeting (but many headaches when it comes to deployment and maintenance), the cloud tends to make life harder where costs are concerned. The following sections will give you some tips and tricks on how to nail down your AWS spending.
Where and how to start your AWS budget tracking if you’re already in the cloud?
If you already have a presence in Amazon’s public cloud, your first step is to head over to AWS Billing and start analyzing your current and previous month’s bills. On this page, you’ll see how much each of the services you use (plus the resources it consumes) is costing you per month.
For example, if you’re using EC2, besides the total number of usage hours of EC2 instances per month in each region, you’ll also find the total EBS storage consumed, as well as costs tied to load balancers or NAT gateways, if used.
Once you start analyzing these numbers, be sure to consult the pricing page for each resource (every AWS resource has a dedicated pricing page, e.g. EC2) as well as the resource’s dashboard in your AWS console, and then try to map which resources you use, what their list price is and how much you actually spend on them each month.
Once you create this connection and know how much each individual resource is taking up in your total budget, you will be able to figure out both “big spenders” and unutilized resources.
And if you’re not?
For those just starting their AWS journey, we suggest a slightly different approach. Before deploying your AWS workloads, draw up a blueprint of all required resources and how they’re connected together. Then go through each individual resource (AWS service) and figure out the cost per service.
Let’s illustrate this with a practical example. Imagine you’re starting an online business and want to deploy a fully redundant, highly available WordPress site, which might one day host your online shop. For that purpose, you would need an architecture similar to this one on AWS:
When users visit your website, the AWS Route 53 service resolves your domain’s DNS queries and routes visitors first to an AWS Elastic Load Balancer. Using the round robin algorithm, the ELB directs users to one of two application instances (hosted on AWS EC2). The two EC2 instances are located in different availability zones for fault tolerance. The EC2 instances store relational data in a multi-AZ RDS database (also spread across two AZs, for high availability), while other data such as pictures and uploaded files is saved in an S3 bucket, for cost optimization and delivery speed.
Once you have your design laid out like this, it’ll be much easier to predict the costs. The only decision left is which instance types to pick for EC2 and RDS. This is something you’ll need to figure out by yourself, but if the business is brand new, start with smaller instance types to lower costs, and then upgrade as your business grows.
For our calculation, we’ll use t3.small for EC2 instances, and db.t3.small for RDS multi-AZ cluster, and we’ll assume that the entire environment is provisioned in us-east-1 region. We’ll also assume that both EC2 and RDS instances use 20 GB EBS volumes for storage, per instance, and that we won’t need more than 100 GB of storage in our S3 bucket.
| Service | Details | Quantity | Unit price (monthly) | Monthly total |
|---|---|---|---|---|
| AWS EBS (EC2) | 20 GB gp2 storage | 2 | $2 | $4 |
| AWS RDS (MySQL, multi-AZ) | db.t3.small | 1 | $48.96 | $48.96 |
| AWS EBS (RDS) | 20 GB | 2 | $2.30 | $4.60 |
| AWS S3 | 100 GB standard storage | 1 | $2.30 | $2.30 |
| AWS ELB | ALB, 1 LCU | 1 | $21.96 | $21.96 |
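Totting up the line items is a one-liner once the unit prices are known. The figures below are the illustrative on-demand prices from the table, not current AWS rates:

```python
# Monthly cost roll-up for the example architecture. Unit prices are the
# illustrative us-east-1 on-demand figures from the table above.
line_items = [
    # (service, unit_price_usd, quantity)
    ("EBS gp2 20 GB (EC2)",       2.00,  2),
    ("RDS db.t3.small multi-AZ", 48.96,  1),
    ("EBS 20 GB (RDS)",           2.30,  2),
    ("S3 standard 100 GB",        2.30,  1),
    ("ALB, 1 LCU",               21.96,  1),
]

total = sum(price * qty for _, price, qty in line_items)
print(f"Estimated monthly total: ${total:.2f}")  # $81.82
```

This covers only the storage, database and load balancing items in the table; the EC2 instance hours themselves, plus the transfer and snapshot costs discussed next, come on top.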
What needs to be added to this amount is data transfer costs for EC2, RDS and S3, which can’t be estimated upfront and should be monitored on a monthly basis (details on data transfer costs are available on each service’s pricing page), as well as snapshot costs (snapshots are a must for RDS in a multi-AZ setup, but how many you keep is up to you).
When the entire sum is determined, don’t forget to add your local VAT. Note that these prices are based on on-demand usage, which can be reduced if you opt for reserved instances for both EC2 and RDS. This could cut your yearly bill for these two services by 30-40%, but you’ll then need to pay your EC2 and RDS costs upfront for the entire year (or even three years, to save even more). Just remember that each service has its own billing specifics, so be sure to drill down into the costs of each resource once you start using it, as we did in this example.
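The reserved-instance math is worth sketching. The 35% discount below is an assumed mid-point of the 30-40% range mentioned above, and the monthly figure is hypothetical; actual RI pricing depends on term, payment option and instance type:

```python
# Rough reserved-instance savings sketch. The 35% discount is an assumed
# mid-point of the 30-40% range; real RI pricing varies by term length,
# payment option and instance type.
on_demand_monthly = 79.33                      # hypothetical EC2 + RDS spend
annual_on_demand = on_demand_monthly * 12
discount = 0.35                                # assumed 1-yr all-upfront discount
annual_reserved = annual_on_demand * (1 - discount)

print(f"On-demand for a year:      ${annual_on_demand:.2f}")
print(f"1-yr all-upfront reserved: ${annual_reserved:.2f}")
print(f"Savings:                   ${annual_on_demand - annual_reserved:.2f}")
```

The trade-off is cash flow: the reserved figure is paid once, upfront, rather than spread over twelve monthly bills.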
Best practices for controlling cloud costs
Before we jump into AWS tools for billing and cost management, we’ll summarize some general best practices on how to optimize your cloud costs:
- By using orchestration and automation tools, such as AWS CloudFormation or HashiCorp Terraform, you’ll minimize the risk of deploying unnecessary resources or resources of inadequate size, since templating plays a big role in automation.
- Always take advantage of the cloud provider’s discounts, such as reserved instances or pre-purchased capacity.
- Start small and scale according to your needs or utilize auto-scaling features of services that support it.
- Monitor each service that you use with as many KPIs as possible to determine underutilized resources or those that can take on more workloads. For example, if you have an EC2 instance that’s consuming 80% of CPU but just 10% of RAM, consider switching to a CPU-optimized instance with less RAM to lower your monthly cost for that instance.
- Read all storage and data transfer pricing for each service very carefully. Cloud consumers typically care about the price of the service itself (its hourly rate) while they neglect storage and data transfer costs. These are usually cheap, but in some cases with improper configuration, these two categories can ramp up your monthly bill.
- Be careful with Spot instances and explore them in detail before use.
- Use cheaper and slower storage for backups and snapshots.
- Take advantage of advanced AWS services, such as serverless computing (AWS Lambda), containers (AWS ECS / Fargate / EKS) and serverless databases (Aurora, DynamoDB).
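The right-sizing idea from the KPI bullet above can be sketched as a simple rule of thumb. The thresholds and instance-family suggestions are illustrative choices of ours; in practice you’d drive this from CloudWatch metrics:

```python
# Toy right-sizing check: flag instances whose CPU and memory utilization
# are badly mismatched. Thresholds and family suggestions are illustrative;
# real data would come from your monitoring system (e.g. CloudWatch).
def rightsizing_hint(cpu_pct, ram_pct):
    if cpu_pct >= 70 and ram_pct <= 20:
        return "consider a compute-optimized instance with less RAM"
    if ram_pct >= 70 and cpu_pct <= 20:
        return "consider a memory-optimized instance"
    if cpu_pct <= 10 and ram_pct <= 10:
        return "underutilized: downsize or consolidate workloads"
    return "utilization looks balanced"

print(rightsizing_hint(80, 10))  # the EC2 example from the bullet above
```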
AWS tools that will help you in your daily cost struggle
As many customers still struggle with cost management, AWS has created numerous tools to help you with your budgeting and cost management tasks.
AWS Budgets
This tool gives you the ability to create custom alerts when an individual service or a group of services exceeds (or is close to exceeding) your budgeted amount. AWS Budgets can also be used to monitor utilization thresholds, so you can easily figure out when a service is over- or underutilized. At the moment, reservation alerts are supported for EC2, RDS, Redshift, ElastiCache and Elasticsearch reserved instances (which means that EC2 instances reserved for ECS / EKS / EMR are also covered by this tool).
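The core alerting idea is easy to illustrate in plain Python: compare actual spend against the budgeted amount and report which threshold percentages have been crossed. The function and thresholds below are our own illustration, not the AWS Budgets API:

```python
# Illustrative version of the threshold logic behind budget alerts:
# report which alert thresholds the current spend has crossed.
# Function name and default thresholds are ours, not the AWS API.
def budget_alerts(budgeted, actual, thresholds=(80, 100)):
    """Return the threshold percentages that the current spend has crossed."""
    used_pct = actual / budgeted * 100
    return [t for t in thresholds if used_pct >= t]

print(budget_alerts(budgeted=500.0, actual=430.0))  # [80] -- 86% of budget used
```

In the real service you would configure these thresholds per budget and have AWS deliver the notifications via SNS or email.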
AWS Cost Explorer
The main idea behind AWS Cost Explorer is to visualize your AWS monthly bills over time, and to easily find trends in a series of custom, user-generated graphs. With AWS Cost Explorer, you can present the trends in your AWS billing to your business stakeholders. If you’re a large enterprise that utilizes multiple AWS accounts, you can aggregate data with Cost Explorer as well.
AWS Cost & Usage Report
AWS Cost & Usage Reports are detailed reports of your spending across all of your accounts, with each service broken down to the last detail. These reports are delivered in .csv format to an S3 bucket of your choice, and they can be generated on a monthly or daily basis, depending on your needs.
Once the report is generated, you can open it in Microsoft Excel or any other compatible .csv reader, or ingest it into Amazon Athena or Amazon Redshift for further analysis. You can also use these .csv reports with your on-premises BI tools, since the format can easily be loaded into any system that supports SQL queries.
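For small reports, even the standard library is enough to do the kind of per-service roll-up you’d otherwise run in Athena. The extract below is made up (real CUR files have many more columns), but the column names shown are genuine CUR columns:

```python
import csv
import io

# A tiny, made-up extract in the spirit of a Cost & Usage Report.
# Real CUR files contain many more columns than these three.
sample_cur = """\
lineItem/ProductCode,lineItem/UsageType,lineItem/UnblendedCost
AmazonEC2,BoxUsage:t3.small,15.18
AmazonEC2,EBS:VolumeUsage.gp2,2.00
AmazonS3,TimedStorage-ByteHrs,2.30
"""

# Aggregate unblended cost per service -- the roll-up you would
# otherwise express as a GROUP BY in Athena or Redshift.
costs = {}
for row in csv.DictReader(io.StringIO(sample_cur)):
    service = row["lineItem/ProductCode"]
    costs[service] = costs.get(service, 0.0) + float(row["lineItem/UnblendedCost"])

print({k: round(v, 2) for k, v in costs.items()})  # {'AmazonEC2': 17.18, 'AmazonS3': 2.3}
```

For production-sized reports (gigabytes per day on large accounts), Athena over the S3 bucket is the more practical route.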
AWS Total Cost of Ownership (TCO) calculator
We can’t vouch that this tool is 100% accurate, but the AWS TCO Calculator should give you an estimate of how much you would pay if you deployed your current on-premises infrastructure on AWS. The comparison includes OPEX such as the cost of physical facilities, power, cooling and networking, but you need to input those figures yourself. Take the end result as an approximation, a useful data point in your migration process.
AWS Pricing Calculator
What we did for you in our example can also be done using the AWS Pricing Calculator. The tool is upgraded and improved almost every year and has come a long way since its introduction in 2007. Besides adding all the services you need, don’t forget to include an estimate of data transfer costs as well.
Jatheon Cloud is a fourth-generation, cloud-based email archiving platform that runs entirely on AWS. To learn more about how your organization can meet email compliance, improve ediscovery and transform its data request processes using Jatheon’s cloud archive, contact us or get a personal demo.