It seems generative AI isn’t going anywhere anytime soon. Amazon Web Services (AWS) is bringing a whole suite of generative AI innovations to its services stack. These innovations are set to empower organizations of all sizes, enabling them to develop cutting-edge generative AI applications, enhance employee productivity, and transform their businesses.
Amazon Bedrock Lays the Foundations for Wider Generative AI Applications & Adoption
Leading the charge is Amazon Bedrock, a fully managed service that simplifies the development of generative AI applications. This service offers a variety of foundation models (FMs) from top AI companies, such as AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon. These FMs are versatile and can be applied to a wide range of use cases, from content creation to drug discovery.
One key challenge for businesses interested in adopting generative AI has been finding the right FM for their specific needs. Amazon Bedrock solves this problem by providing access to a diverse selection of FMs via a single application programming interface (API). It also removes the need for specialized hardware deployments. This streamlines the process and eliminates the complexity of managing multiple models and infrastructures.
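As a rough illustration, the sketch below calls a model through the Bedrock runtime API using boto3. The region, model ID, and request schema (Anthropic’s Claude text-completion format) are assumptions for this example; each provider defines its own body format, and trying a different FM is largely a matter of changing the model ID and payload.

```python
import json
import boto3

# One runtime client covers every foundation model exposed by Bedrock.
# Assumes the account has been granted access to the chosen model.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in Anthropic Claude's text-completion format (an assumption
# for this sketch; other providers use different fields).
body = json.dumps({
    "prompt": "\n\nHuman: Suggest three uses of generative AI in retail.\n\nAssistant:",
    "max_tokens_to_sample": 300,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # swap the ID to call a different FM
    contentType="application/json",
    accept="application/json",
    body=body,
)

print(json.loads(response["body"].read())["completion"])
```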
Unlocking the Power of Generative AI Through Amazon Titan Embeddings
One of the highlights is the general availability of Amazon Titan Embeddings, part of the Amazon Titan family of models created and pre-trained by AWS. This large language model (LLM) converts text into numerical representations called embeddings, which are crucial for tasks like search, personalization, and retrieval-augmented generation (RAG).
What sets Amazon Titan Embeddings apart is its ability to support more than 25 languages and handle context lengths of up to 8,192 tokens. This makes it highly versatile and suitable for various applications, from processing single words to entire documents. Its output vectors, boasting 1,536 dimensions, ensure both accuracy and low-latency performance.
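A minimal sketch of generating embeddings and using them for a toy retrieval step (the kind of lookup a RAG pipeline performs) is shown below. It assumes the amazon.titan-embed-text-v1 model ID and the inputText/embedding request and response fields; consult the Bedrock documentation for the current schema.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> list[float]:
    """Return the 1,536-dimension embedding vector for a piece of text."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed Titan Embeddings ID
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

# Toy retrieval: rank candidate passages against a query by similarity.
docs = [
    "Amazon Bedrock offers foundation models through a single API.",
    "Amazon QuickSight is a unified business intelligence service.",
]
query = embed("Which service provides foundation models?")
best = max(docs, key=lambda d: cosine(embed(d), query))
print(best)
```

In practice the document vectors would be computed once and stored in a vector index rather than re-embedded on every query, but the ranking step works the same way.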
Integration with Meta’s Llama 2
In the coming weeks, Amazon Bedrock will also offer Llama 2, Meta’s latest family of large language models. Llama 2 models come with significant enhancements over the first generation, having been trained on 40% more data and featuring a longer context length of 4,096 tokens. These improvements make Llama 2 well suited to dialogue-based applications, providing fast responses on AWS infrastructure without the need for complex setup and management.
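Once the models are live, invoking them should follow the same pattern as other Bedrock FMs. The sketch below is speculative: the model ID and the prompt/max_gen_len/generation fields are assumptions about how a Llama 2 chat model will be exposed, so check the Bedrock console for the actual identifiers at launch.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Assumed request fields and model ID for a Llama 2 chat model on Bedrock;
# verify both against the console once the models become available.
body = json.dumps({
    "prompt": "User: What makes a good customer-support chatbot?\nAssistant:",
    "max_gen_len": 256,
    "temperature": 0.5,
})

response = bedrock.invoke_model(
    modelId="meta.llama2-13b-chat-v1",  # hypothetical until general availability
    contentType="application/json",
    accept="application/json",
    body=body,
)

print(json.loads(response["body"].read())["generation"])
```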
Securely Customize Amazon CodeWhisperer Suggestions with Your Own Codebase
For developers, Amazon CodeWhisperer has been a game-changer. This AI-powered coding companion is trained on billions of lines of Amazon and publicly available code, making it a valuable tool for improving developer productivity. Now, developers can securely customize CodeWhisperer’s code suggestions using their private codebase, addressing the challenges of working with internal, proprietary code.
This customization capability streamlines the process of finding and incorporating internal code into applications. Developers save time, as they no longer need to manually search through extensive internal code repositories. Additionally, administrators can centrally manage customizations, ensuring adherence to quality and security standards.
Generative Business Intelligence with Amazon QuickSight
Amazon QuickSight, a unified business intelligence (BI) service, is introducing Generative BI authoring capabilities. These capabilities go beyond answering structured queries and allow business analysts to easily create and customize visuals using natural-language commands. Analysts can describe their desired outcome, and QuickSight generates compelling visuals, reducing the time spent on manual tasks like data source identification and visualization creation.
Availability
Amazon Bedrock is now generally available, offering businesses the opportunity to leverage the power of generative AI. Amazon Titan Embeddings is also generally available, while Llama 2 is set to launch in the next few weeks. Customizations for Amazon CodeWhisperer are coming soon, while Generative BI authoring capabilities in Amazon QuickSight are already available on AWS.