AWS re:Invent 2018: Key takeaways

AWS re:Invent, arguably the biggest and most important event in the cloud computing calendar, has just wrapped up for 2018. The event, held in Las Vegas, attracted over 50,000 attendees, including customers, partners and Amazon employees, for keynote sessions, workshops and networking.

During the event, AWS made significant announcements across all of its services. We can’t cover all of them here, but I’ve highlighted the top takeaways from the event below:

Machine Learning to remain a key focus area:

AWS returned to its core mission to “put machine learning in the hands of every developer” and made it clear that the company’s commitment to machine learning is ongoing. AWS highlighted the success of SageMaker, stating that it is already used by over 10,000 customers despite having launched only a year ago. The AWS Inferentia inference chip marks another step in that direction for Amazon Web Services.

Swami Sivasubramanian, VP of AI & Machine Learning at AWS, described how AWS customer Intuit significantly cut the time it takes to build machine learning models, such as personalisation and fraud detection, by leveraging SageMaker, resulting in a huge productivity gain. Another key takeaway from his talk was that AWS has launched 200 new product features and services under its machine learning practice since this time last year.
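For readers unfamiliar with SageMaker, the sketch below shows roughly what the managed train-and-deploy workflow looks like with the SageMaker Python SDK. It is a minimal illustration only: the IAM role, training script and S3 path are hypothetical placeholders, not anything Intuit or AWS published.

    from sagemaker.sklearn.estimator import SKLearn

    # Placeholders: supply your own execution role, training script and data location
    role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"

    estimator = SKLearn(
        entry_point="train.py",            # your training script
        framework_version="0.23-1",
        instance_type="ml.m5.xlarge",
        instance_count=1,
        role=role,
    )

    # SageMaker provisions the training infrastructure, runs train.py, then tears it down
    estimator.fit({"train": "s3://my-bucket/fraud-detection/train/"})

    # One call stands up a managed HTTPS endpoint for inference
    predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")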

Announcement of new Blockchain services:

AWS announced two new Blockchain services at re:Invent last week: Amazon Quantum Ledger Database (QLDB) and Amazon Managed Blockchain. AWS stated that “Amazon Managed Blockchain is a fully managed service that makes it easy to create and manage scalable Blockchain networks using popular open source frameworks Hyperledger Fabric & Ethereum.” They said that Amazon Quantum Ledger Database (QLDB) is a “purpose-built ledger database that provides a complete and verifiable history of application data changes.”

During the event, AWS CEO Andy Jassy discussed the approach AWS took to launching these services. He said that AWS first wanted to genuinely understand the “real customer need” and then deliver the right service for the right set of customers. The announcement signals how AWS intends to put scalable Blockchain tools and services into the hands of developers and customers going forward.
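As a taste of what that looks like for developers, here is a minimal, hedged sketch using the boto3 qldb client as it exists today; the ledger name and region are placeholders, and since both services were announced in preview, availability and API details depended on sign-up at the time.

    import boto3

    qldb = boto3.client("qldb", region_name="us-east-1")

    # Create a ledger whose journal keeps an immutable, cryptographically
    # verifiable history of every change made to the data
    qldb.create_ledger(Name="vehicle-registration", PermissionsMode="ALLOW_ALL")

    # Check the ledger's provisioning status (it becomes ACTIVE when ready)
    print(qldb.describe_ledger(Name="vehicle-registration")["State"])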

Other Products, Services, Features and Functionality:

Below is a brief summary of some of the most significant announcements AWS made during re:Invent 2018 across different products and services:

AWS Outposts

AWS Outposts is new on-premises hardware that AWS has developed alongside VMware. It is fully managed, maintained and supported by AWS to deliver access to the latest AWS services. The announcement will certainly be welcome news for customers seeking to run efficiently in a hybrid model.

AWS Inferentia: Machine Learning Inference Chip

AWS Inferentia is a machine learning inference chip designed to deliver high performance at low cost. It will support the TensorFlow, Apache MXNet and PyTorch deep learning frameworks, as well as models that use the ONNX format, and will be available for use with Amazon EC2, Amazon SageMaker and Amazon Elastic Inference.
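Because Inferentia accepts models in the ONNX format alongside the framework-native paths, the snippet below sketches the first step a team might take today: exporting a trained PyTorch model to ONNX. The subsequent compile-and-deploy step onto Inferentia hardware relies on AWS tooling that wasn’t public at the time of the announcement, so it isn’t shown.

    import torch
    import torchvision

    # Export a trained PyTorch model (a stock ResNet-18 here) to the ONNX format
    model = torchvision.models.resnet18(pretrained=True)
    model.eval()

    dummy_input = torch.randn(1, 3, 224, 224)  # example input that fixes the graph's shape
    torch.onnx.export(
        model,
        dummy_input,
        "resnet18.onnx",
        input_names=["input"],
        output_names=["output"],
    )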

Lambda

AWS has long been an ardent advocate of serverless computing, and to help popularise it, launched Lambda in 2015. For those who don’t know, AWS Lambda lets you run code without managing servers. Customers pay only for the compute time they consume – there is no charge when the code is not running.
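For context, a Lambda function is just a handler that the service invokes on demand; a minimal Python example looks something like this (the event shape is whatever your trigger sends):

    import json

    def lambda_handler(event, context):
        # 'event' is the invocation payload; 'context' exposes runtime metadata
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }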

The key Lambda takeaway from re:Invent 2018 was that Lambda now supports a Ruby runtime, as well as custom runtimes via the new Lambda Runtime API. The custom runtimes added so far are C++ and Rust, both AWS-provided, with Erlang, Elixir, COBOL and PHP coming from partners. Alongside this, Lambda Layers make it possible to share code or data between Lambda functions, removing the need to package anything shareable alongside each Lambda function individually.
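To illustrate what a custom runtime actually does, here is a stripped-down sketch of the event loop defined by the Lambda Runtime API: poll for the next invocation, run the handler, post the result back. A real custom runtime ships as an executable named bootstrap inside the deployment package or a layer; the handler below is a hypothetical stand-in, written in Python purely for illustration.

    import json
    import os
    import urllib.request

    API = f"http://{os.environ['AWS_LAMBDA_RUNTIME_API']}/2018-06-01/runtime"

    def handle(event):
        # Hypothetical handler; a real runtime would load whatever handler is configured
        return {"echo": event}

    while True:
        # Block until Lambda hands us the next invocation
        with urllib.request.urlopen(f"{API}/invocation/next") as resp:
            request_id = resp.headers["Lambda-Runtime-Aws-Request-Id"]
            event = json.loads(resp.read())

        # Post the handler's result back for this invocation
        body = json.dumps(handle(event)).encode()
        req = urllib.request.Request(f"{API}/invocation/{request_id}/response", data=body, method="POST")
        urllib.request.urlopen(req).read()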

AWS Firecracker

Firecracker is a new virtualisation technology that allows you to launch lightweight micro-virtual machines (microVMs) in non-virtualised environments in a fraction of a second. With Firecracker, you can combine the security and workload isolation of traditional VMs with the resource efficiency of containers.
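Under the hood, Firecracker is driven by a small REST API exposed over a Unix socket: you describe a kernel, a root filesystem and the machine, then ask it to boot. The sketch below is illustrative only, assuming a firecracker process already running with --api-sock /tmp/firecracker.socket and placeholder kernel and rootfs paths.

    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTP connection over a Unix domain socket instead of host:port."""
        def __init__(self, socket_path):
            super().__init__("localhost")
            self.socket_path = socket_path

        def connect(self):
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self.socket_path)

    def put(resource, body):
        conn = UnixHTTPConnection("/tmp/firecracker.socket")
        conn.request("PUT", resource, json.dumps(body), {"Content-Type": "application/json"})
        status = conn.getresponse().status
        conn.close()
        print(resource, status)

    # Point the microVM at a kernel and root filesystem (placeholder paths), then boot it
    put("/boot-source", {"kernel_image_path": "vmlinux", "boot_args": "console=ttyS0 reboot=k panic=1"})
    put("/drives/rootfs", {"drive_id": "rootfs", "path_on_host": "rootfs.ext4",
                           "is_root_device": True, "is_read_only": False})
    put("/actions", {"action_type": "InstanceStart"})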

AWS SAM

The AWS Serverless Application Model (AWS SAM) now supports defining and deploying nested applications taken from the AWS Serverless Application Repository.

API Gateway

API Gateway now supports WebSockets, which makes serverless a far more attractive option for building single-page applications, live content updates and more.
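As a rough sketch of the new model, a WebSocket-backed Lambda function can push data back to a connected client through the API Gateway Management API; the route configuration and API setup are assumed and not shown here.

    import boto3

    def lambda_handler(event, context):
        # API Gateway WebSocket events identify the caller's persistent connection
        ctx = event["requestContext"]
        endpoint = f"https://{ctx['domainName']}/{ctx['stage']}"

        client = boto3.client("apigatewaymanagementapi", endpoint_url=endpoint)
        # Echo the received message back over the same connection
        client.post_to_connection(
            ConnectionId=ctx["connectionId"],
            Data=(event.get("body") or "").encode(),
        )
        return {"statusCode": 200}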

Lambda functions can now be placed behind an Application Load Balancer to serve HTTP/HTTPS traffic, meaning we are no longer limited to invoking functions via API Gateway to expose them to the web.
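The response shape an ALB target expects is slightly different from API Gateway's; a minimal, hedged example of a function serving HTTP directly behind an ALB might look like this (the target-group wiring itself isn't shown):

    import json

    def lambda_handler(event, context):
        # ALB events include the HTTP method, path, headers and body directly
        path = event.get("path", "/")
        return {
            "statusCode": 200,
            "statusDescription": "200 OK",
            "isBase64Encoded": False,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"you_requested": path}),
        }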

Amazon EC2 A1 Instances

The new Amazon EC2 A1 instances could deliver significant cost savings for scale-out and Arm-based applications such as web servers, containerised microservices, caching fleets, and distributed data stores that are supported by the extensive Arm ecosystem. A1 instances are the first EC2 instances powered by AWS Graviton Processors, which feature 64-bit Arm Neoverse cores and custom silicon designed by AWS. This is particularly relevant for customers who already run their workloads on Arm: if the instances deliver the performance they require, switching could mean a significant cost reduction.
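Trying the new instance family is simply a matter of choosing an a1.* instance type and an arm64 AMI; a hedged boto3 sketch with placeholder values:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Placeholder AMI ID: any arm64 image (e.g. Amazon Linux 2 for arm64) will do
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="a1.medium",
        MinCount=1,
        MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])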
