Using Cost Allocation Report on Amazon Web Services (AWS)


AWS offers web services like compute instances. Lately, I have been using one cc2.8xlarge instance for 3 hours on a weekly basis to give training sessions. My 14 students connect to orion.cloud.raytrek.com (a canonical name, or CNAME, pointing to my AWS instance) during every training session. The instance has one additional 300 GiB EBS volume attached to it so that my students can keep their data for the whole duration of the training program.

On AWS, I can tag anything I use: EC2 instances, EBS volumes, S3 buckets, and so on. A tag is a key-value pair (key=value), for example Project=Ray-Cloud-Browser-public-demo. On AWS, it is also possible to activate a feature called the Cost Allocation Report. This feature deposits detailed usage reports, including costs, in an S3 bucket that I own.
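Tagging can also be scripted. Here is a minimal sketch with a recent aws-sdk Ruby gem (not something from my setup; the region and instance ID are placeholder values):

require 'aws-sdk-ec2'

# Sketch: tag one EC2 instance with Project=Ray-Cloud-Browser-public-demo.
# The region and instance ID below are placeholders.
ec2 = Aws::EC2::Client.new(region: 'us-east-1')
ec2.create_tags(
  resources: ['i-0123456789abcdef0'],
  tags: [{ key: 'Project', value: 'Ray-Cloud-Browser-public-demo' }]
)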

I tagged my Cost Allocation Report S3 bucket with Project=Billing to get a grasp on how much it costs to use the Cost Allocation Report feature itself. Getting these reports costs me only about $0.03 (see the Project=Billing table below).

I wrote a Ruby script that generates pivot tables for my projects. Below are my Cost Allocation Report pivot tables, with some confidential information redacted. Entries under Project=not-classified are resources that were not tagged.
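The script essentially groups the report's line items by the Project tag and sums the costs. Here is a minimal sketch of that idea (not my exact script; the column names user:Project, ProductCode, UsageType, Units and TotalCost are assumptions, so adjust them to the actual CSV header):

#!/usr/bin/env ruby
# Minimal sketch: pivot an AWS Cost Allocation Report CSV by Project tag.
# The column names used here are assumptions; adjust to the real header.
require 'csv'

# project -> [product code, usage type, units] -> accumulated cost
totals = Hash.new { |h, k| h[k] = Hash.new(0.0) }

CSV.foreach(ARGV[0], headers: true) do |row|
  project = row['user:Project']
  project = 'not-classified' if project.nil? || project.empty?
  key = [row['ProductCode'], row['UsageType'], row['Units']]
  totals[project][key] += row['TotalCost'].to_f
end

totals.each do |project, rows|
  puts "Project=#{project}"
  rows.each do |(product, usage_type, units), cost|
    puts format('%-15s | %-27s | %-14s | %f', product, usage_type, units, cost)
  end
  puts format('Total=%f', rows.values.inject(0.0, :+))
  puts
end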

Pivot table for Cost Allocation Report on AWS
File from AWS S3: #####################-aws-cost-allocation-2013-02.csv
Project=ray-in-cloud-cc2.8xlarge-CLI
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 2.8E-7           | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00006033       | 0.0
AmazonEC2       | SpotUsage:cc2.8xlarge       | instance-hours | 10.00000000      | 2.7
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00002029       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00004414       | 5.0e-06
Total=2.700005

Project=Ray-Cloud-Browser-##############
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 0.00004765       | 6.0e-06
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00000286       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 8.65691169       | 0.0
AmazonEC2       | SpotUsage:t1.micro          | instance-hours | 338.75138122     | 1.017348
AmazonEC2       | EBS:VolumeUsage             | GB-months      | 36.33402209      | 3.633367
AmazonEC2       | EBS:VolumeIOUsage           | I/O requests   | 3861215.84788799 | 0.385104
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00000230       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.18180145       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00000500       | 1.0e-06
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.39561271       | 0.047268
Total=5.083094

Project=formation-#############-bioinformatique-hiver-2013
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 0.00045277       | 6.0e-05
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 3.90998689       | 0.0
AmazonEC2       | BoxUsage:cc2.8xlarge        | instance-hours | 10.00000000      | 24.0
AmazonEC2       | EBS:VolumeUsage             | GB-months      | 145.47762856     | 14.547622
AmazonEC2       | EBS:VolumeIOUsage           | I/O requests   | 267824.57214411  | 0.026712
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.10216341       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.22231474       | 0.026562
Total=38.600956

Project=not-classified
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 0.07435498       | 0.009904
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 0.00000645       | 1.0e-06
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00165190       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.04349139       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.04270176       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.04325174       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 5.47499061       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00000197       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00018585       | 0.0
AmazonS3        | Requests-Tier1              | HTTP requests  | 26.42622951      | 0.001066
AmazonEC2       | BoxUsage:cc2.8xlarge        | instance-hours | 1.00000000       | 2.4
AmazonEC2       | SpotUsage:t1.micro          | instance-hours | 1.02651934       | 0.003083
AmazonEC2       | SpotUsage:t1.micro          | instance-hours | 250.47071823     | 0.752221
AmazonEC2       | EBS:VolumeUsage             | GB-months      | 28.04440006      | 2.804413
AmazonEC2       | DataProcessing-Bytes        |                | 0.00146411       | 0.01
AmazonSNS       | Requests-Tier1              | HTTP requests  | 459.00000000     | 0.0
AmazonEC2       | EBS:VolumeIOUsage           | I/O requests   | 3051481.50818382 | 0.304344
AmazonEC2       | SpotUsage:cr1.8xlarge       | instance-hours | 9.00000000       | 3.09
AmazonEC2       | LoadBalancerUsage           |                | 1.00000000       | 0.03
AmazonEC2       | SpotUsage:cc2.8xlarge       | instance-hours | 10.00000000      | 2.7
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00025260       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00293550       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00102118       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00026375       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.14774928       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 4.3E-7           | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00040591       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00054968       | 6.6e-05
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00638785       | 0.000763
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00222216       | 0.000266
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00057394       | 6.9e-05
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.32151280       | 0.038415
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 9.4E-7           | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00088329       | 0.000106
AmazonEC2       | SpotUsage:cr1.8xlarge       | instance-hours | 3.00000000       | 1.03
AmazonEC2       | SpotUsage:cc2.8xlarge       | instance-hours | 1.00000000       | 0.27
Total=13.444717

Project=Ray-Cloud-Browser-public-demo
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-Regional-Bytes | GB             | 0.00021335       | 2.8e-05
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00002007       | 0.0
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.22613437       | 0.0
AmazonEC2       | SpotUsage:t1.micro          | instance-hours | 338.75138122     | 1.017348
AmazonEC2       | EBS:VolumeUsage             | GB-months      | 36.33402209      | 3.633367
AmazonEC2       | EBS:VolumeIOUsage           | I/O requests   | 3726756.78467940 | 0.371693
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00001613       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.56092834       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00003510       | 4.0e-06
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 1.22061940       | 0.145841
Total=5.168281

Project=ray-in-cloud-cc2.8xlarge
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.04343685       | 0.0
AmazonEC2       | SpotUsage:cc2.8xlarge       | instance-hours | 10.00000000      | 2.7
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00232737       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00506452       | 0.000605
Total=2.700605

Project=Billing
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AWSDataTransfer | DataTransfer-In-Bytes       | GB             | 0.00301682       | 0.0
AmazonS3        | Requests-Tier1              | HTTP requests  | 221.57377049     | 0.008934
AmazonS3        | Requests-Tier2              | HTTP requests  | 362.00000000     | 0.01
AmazonS3        | TimedStorage-ByteHrs        | GB             | 0.00004882       | 0.01
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00011206       | 0.0
AWSDataTransfer | DataTransfer-Out-Bytes      | GB             | 0.00024386       | 2.9e-05
Total=0.028963

Project=Ray-TestSuite
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AmazonEC2       | EBS:VolumeUsage             | GB-months      | 0.01230827       | 0.001231
AmazonEC2       | EBS:VolumeIOUsage           | I/O requests   | 21529.28710468   | 0.002147
Total=0.003378

Project=###############-instance-testing
Product Code    | Usage Type                  | Units          | Usage Quantity   | Total Cost ($)
AmazonEC2       | BoxUsage:hs1.8xlarge        | instance-hours | 1.00000000       | 4.6
Total=4.6

People at Amazon.com, Inc. always say that they are obsessed with their customers. I am a happy customer of Amazon Web Services, Inc. (AWS), and I can confirm that AWS makes things really easy for the customer in many ways, the Cost Allocation Report being one of them.

The Cost Allocation Report is truly a feature for the customer: it allows a much better understanding of costs in the cloud. AWS could have charged a lot for that kind of feature; banks, for example, charge their customers a lot for account statements from 3 years ago.


p.s.: I have no financial or commercial ties with AWS; I am really just one happy AWS customer. I really think that AWS gives me a great service for the money I give them. It's a win-win situation.

Comments

Beyond said…
Amazon S3 provides a secure, reliable, scalable, and inexpensive "pay only for what you use" storage service on the internet, so I like using it very much.
Bucket tagging is very helpful for segmenting our S3 bill according to the tags added on the buckets.

Regards
Ronak Jain
Bucket Explorer
Bucket Explorer is a graphical user interface (GUI) for managing data on Amazon S3 in an efficient and user-friendly way.
atg2taa said…
This comment has been removed by the author.
sebhtml said…
atg2taa wrote:
> Hi Sébastien, I'm interested in
> using AWS for genome assemblies.
> I'm looking at the super large
> memory instances but some of the
> new terminology is hard to
> translate into terms I'm used to.
>
> Could you comment a bit on the
> pricing and time costs of doing,
> say a 50Mb genome using AWS?

You can use the spot market; it is cheap and gives you access to nice machines.

The cr1.8xlarge has 32 vCPUs, 244 GiB of memory, and even SSDs!

(see http://aws.amazon.com/ec2/instance-types/instance-details/ )

cr1.8xlarge is around $0.35 per hour on the spot market, so, for example, ten instance-hours would cost about $3.50.
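
Requesting a spot instance can also be scripted. A rough sketch with a recent aws-sdk Ruby gem (the region, bid price, and AMI ID here are placeholders):

require 'aws-sdk-ec2'

# Sketch: bid on one cr1.8xlarge spot instance.
# The region, bid price, and AMI ID are placeholders.
ec2 = Aws::EC2::Client.new(region: 'us-east-1')
ec2.request_spot_instances(
  spot_price: '0.35',                  # maximum bid, in $ per hour
  instance_count: 1,
  launch_specification: {
    image_id: 'ami-0123456789abcdef0', # placeholder AMI
    instance_type: 'cr1.8xlarge'
  }
)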

>
> Is transferring/reading data from
> EBS just as fast as reading from
> local disk?

EBS is tunable. Regular EBS is probably slower, but you can provision a number of I/O operations per second (provisioned IOPS) or use EBS-optimized instances, which use a dedicated network channel for EBS.
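
For example, a provisioned-IOPS volume can be created with a recent aws-sdk Ruby gem; this is only a sketch, and the region, zone, size, and IOPS figure are placeholders:

require 'aws-sdk-ec2'

# Sketch: create a 300 GiB EBS volume with provisioned IOPS (io1).
# The region, availability zone, size, and IOPS are placeholders.
ec2 = Aws::EC2::Client.new(region: 'us-east-1')
volume = ec2.create_volume(
  availability_zone: 'us-east-1a',
  size: 300,          # GiB
  volume_type: 'io1', # provisioned-IOPS volume type
  iops: 1000          # provisioned I/O operations per second
)
puts volume.volume_id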

Since cr1.8xlarge has SSD disks, you can also use these (but they are ephemeral).

But Ray only needs disks to load input data and store results; all computation is done in memory.



>
> Thanks,
