The AWS event is in full swing, folks. Amazon has just announced a brand new batch processing service, now in preview, that helps automate the deployment of batch processing jobs. Welcome AWS Batch!
Batch processing, in case you are unaware of it, refers to processing a previously collected set of jobs together in a single run. Until now, the open-source Hadoop big data framework was the primary tool for batch processing. In fact, AWS and the other public clouds relied on it to build their own managed offerings for batch processing and streaming workloads.
However, the company is now striking out on its own and has created its own system to let developers process tasks in batches. Batch also works with containers, something traditional virtual machines (VMs) still cannot match. Customers specify the exact container images to run, and the service executes them on top of the AWS EC2 computing infrastructure.
As per a blog post:
AWS Batch allows batch administrators, developers, and users to have access to the power of the cloud without having to provision, manage, monitor, or maintain clusters. There’s nothing to buy and no software to install. AWS Batch takes care of the undifferentiated heavy lifting and allows you to run your container images and applications on a dynamically scaled set of EC2 instances. It is efficient, easy to use, and designed for the cloud.
Amazon apparently got the inspiration to build Batch after it saw that many of its customers were building their own batch processing systems using EC2 instances.
In the past, many AWS customers have built their own batch processing systems using EC2 instances, containers, notifications, CloudWatch monitoring, and so forth. This turned out to be a very common AWS use case and we decided to make it even easier to achieve.
Meanwhile, shell scripts and Linux executables are supported at present, with Lambda function support coming in the near future. The service is currently available in preview mode.
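To give a rough idea of what this looks like in practice, here is a minimal sketch of a container-based job definition of the kind AWS Batch accepts. The names (`hello-batch`, the `busybox` image, the echoed message) are hypothetical placeholders, not taken from the announcement:

```json
{
  "jobDefinitionName": "hello-batch",
  "type": "container",
  "containerProperties": {
    "image": "busybox",
    "vcpus": 1,
    "memory": 128,
    "command": ["echo", "hello from AWS Batch"]
  }
}
```

A definition like this would be registered with the `aws batch register-job-definition` CLI command, after which individual jobs referencing it could be submitted to a job queue via `aws batch submit-job`, with AWS Batch provisioning and scaling the underlying EC2 instances automatically.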