Building an Amazon Bedrock JIRA Agent with Source Code Knowledge Base - Part 2
Using Agents for Amazon Bedrock to interact with JIRA is surprisingly powerful, especially with the ability to add comments and create new tasks. Giving the same agent access to source code using RAG takes it to another level.
Building an Amazon Bedrock JIRA Agent with Source Code Knowledge Base - Part 1
Amazon released the Bedrock service to provide access to multiple foundation models via a generalised API. During 2023 they released two features within this service that I wanted to try: native Agents and Retrieval-Augmented Generation with Knowledge Bases.
Running an LLM inside an AWS Lambda Function
Large Language Models and Generative AI are big topics in 2023, and whilst using them as a service is great (and sensible), I wondered if it was possible to run an LLM completely within a Lambda Function.