diff --git a/06_CodeGeneration/00_code_generatation_w_bedrock.ipynb b/06_CodeGeneration/00_code_generatation_w_bedrock.ipynb new file mode 100644 index 00000000..7f8cd2b5 --- /dev/null +++ b/06_CodeGeneration/00_code_generatation_w_bedrock.ipynb @@ -0,0 +1,995 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "dc40c48b-0c95-4757-a067-563cfccd51a5", + "metadata": { + "tags": [] + }, + "source": [ + "# Invoke Bedrock model for code generation\n", + "\n", + "> *This notebook should work well with the **`Data Science 3.0`** kernel in SageMaker Studio*" + ] + }, + { + "cell_type": "markdown", + "id": "c9a413e2-3c34-4073-9000-d8556537bb6a", + "metadata": {}, + "source": [ + "## Introduction\n", + "\n", + "In this notebook we show you how to use an LLM to generate code from a text prompt.\n", + "\n", + "We will use Bedrock's Claude v2 model via the Boto3 API.\n", + "\n", + "The prompt used in this example is called a zero-shot prompt because we are not providing any examples other than the prompt itself.\n", + "\n", + "**Note:** *This notebook can be run within or outside of the AWS environment.*\n", + "\n", + "#### Context\n", + "To demonstrate the code generation capability of Amazon Bedrock, we will explore the use of the Boto3 client to communicate with the Amazon Bedrock API. We will demonstrate the different configurations available as well as how a simple input can lead to the desired output.\n", + "\n", + "#### Pattern\n", + "We will simply provide the Amazon Bedrock API with an input consisting of a task, an instruction and an input for the model under the hood to generate an output, without providing any additional examples. The purpose here is to demonstrate how powerful LLMs easily understand the task at hand and generate compelling outputs.\n", + "\n", + "![](./images/bedrock-code-gen.png)\n", + "\n", + "#### Use case\n", + "To demonstrate the generation capability of models in Amazon Bedrock, let's take the use case of code generation.\n", + "\n", + "#### Persona\n", + "\n", + "You are Moe, a Data Analyst at AnyCompany. The company wants to understand its sales performance for different products over the past year. You have been provided a dataset named sales.csv. The dataset contains the following columns:\n", + "\n", + "- Date (YYYY-MM-DD format)\n", + "- Product_ID (unique identifier for each product)\n", + "- Price (price at which each product was sold)\n", + "- Units_Sold (number of units sold on that date)\n", + "\n", + "#### Implementation\n", + "To fulfill this use case, in this notebook we will show how to generate code for a given prompt. We will use the Anthropic Claude v2 model via the Amazon Bedrock API with the Boto3 client. " + ] + }, + { + "cell_type": "markdown", + "id": "64baae27-2660-4a1e-b2e5-3de49d069362", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "Before running the rest of this notebook, you'll need to run the cells below to (ensure necessary libraries are installed and) connect to Bedrock.\n", + "\n", + "For more details on how the setup works and ⚠️ **whether you might need to make any changes**, refer to the [Bedrock boto3 setup](../00_Intro/bedrock_boto3_setup.ipynb) notebook.",
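+ "\n", + "⚠️ The region and the `AWS_PROFILE` value set in the client-creation cell below are workshop placeholders; treat them as examples only. If you are running in SageMaker Studio with an execution role that already has Bedrock access, a minimal sketch of that cell's environment setup could be as simple as:\n", + "\n", + "```python\n", + "# Minimal sketch when the notebook's execution role already has Bedrock permissions\n", + "os.environ[\"AWS_DEFAULT_REGION\"] = \"us-east-1\"  # a region where Bedrock is available to you\n", + "# os.environ[\"AWS_PROFILE\"] = \"...\"  # only needed outside AWS, with a named credential profile\n", + "```"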
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "38b791ad-e6c5-4da5-96af-5c356a36e19d", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Make sure you ran `download-dependencies.sh` from the root of the repository first!\n", + "%pip install --no-build-isolation --force-reinstall \\\n", + " ../dependencies/awscli-*-py3-none-any.whl \\\n", + " ../dependencies/boto3-*-py3-none-any.whl \\\n", + " ../dependencies/botocore-*-py3-none-any.whl\n", + "\n", + "%pip install --quiet langchain==0.0.249" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7ea26558", + "metadata": {}, + "outputs": [], + "source": [ + "# Optional - To execute the generated code in this notebook\n", + "%pip install matplotlib" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "776fd083", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "import json\n", + "import os\n", + "import sys\n", + "\n", + "import boto3\n", + "\n", + "module_path = \"..\"\n", + "sys.path.append(os.path.abspath(module_path))\n", + "from utils import bedrock, print_ww\n", + "\n", + "\n", + "# ---- ⚠️ Un-comment and edit the below lines as needed for your AWS setup ⚠️ ----\n", + "\n", + "os.environ[\"AWS_DEFAULT_REGION\"] = \"us-east-1\" # E.g. \"us-east-1\"\n", + "os.environ[\"AWS_PROFILE\"] = \"fine-tuning-bedrock\"\n", + "# os.environ[\"BEDROCK_ASSUME_ROLE\"] = \"\" # E.g. \"arn:aws:...\"\n", + "# os.environ[\"BEDROCK_ENDPOINT_URL\"] = \"\" # E.g. \"https://...\"\n", + "\n", + "\n", + "boto3_bedrock = bedrock.get_bedrock_client(\n", + " assumed_role=os.environ.get(\"BEDROCK_ASSUME_ROLE\", None),\n", + " endpoint_url=os.environ.get(\"BEDROCK_ENDPOINT_URL\", None),\n", + " region=os.environ.get(\"AWS_DEFAULT_REGION\", None),\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "4f634211-3de1-4390-8c3f-367af5554c39", + "metadata": {}, + "source": [ + "## Code Generation\n", + "\n", + "Following on from the use case explained above, let's prepare an input for the Amazon Bedrock service to generate a Python program for our use case." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "id": "45ee2bae-6415-4dba-af98-a19028305c98", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Create the prompt\n", + "# Analyzing sales with a Python Program\n", + "\n", + "prompt_data = \"\"\"\n", + "Command: Human: You have a CSV, sales.csv, with columns:\n", + "- date (YYYY-MM-DD)\n", + "- product_id\n", + "- price\n", + "- units_sold\n", + "\n", + "Write a python program to load the data and determine \n", + "\n", + "- Total revenue for the year\n", + "- The product with the highest revenue\n", + "- The date with the highest revenue\n", + "- Visualize monthly sales using a bar chart\n", + "\n", + "Assistant:\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "id": "cc9784e5-5e9d-472d-8ef1-34108ee4968b", + "metadata": {}, + "source": [ + "Let's start by using the Anthropic Claude V2 model.",
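+ "\n", + "The next cell builds the request body with moderate sampling settings (`temperature` and `top_p` of 0.5, `top_k` of 250). For code generation you usually want output that is as deterministic as possible, so a lower temperature is worth trying; the values below are only an illustrative sketch, not recommended settings:\n", + "\n", + "```python\n", + "# Illustrative, more deterministic request body for code generation\n", + "body = json.dumps({\n", + "    \"prompt\": prompt_data,\n", + "    \"max_tokens_to_sample\": 2048,\n", + "    \"temperature\": 0.1,\n", + "    \"top_p\": 0.9,\n", + "    \"stop_sequences\": [\"\\n\\nHuman:\"]\n", + "})\n", + "```"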
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "8af670eb-ad02-40df-a19c-3ed835fac8d9", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Claude - Body Syntax\n", + "body = json.dumps({\n", + " \"prompt\": prompt_data,\n", + " \"max_tokens_to_sample\":4096,\n", + " \"temperature\":0.5,\n", + " \"top_k\":250,\n", + " \"top_p\":0.5,\n", + " \"stop_sequences\": [\"\\n\\nHuman:\"]\n", + " }) " + ] + }, + { + "cell_type": "markdown", + "id": "c4ca6751", + "metadata": {}, + "source": [ + "The Amazon Bedrock API provides you with an API `invoke_model` which accepts the following:\n", + "- `modelId`: The identifier (model ID or ARN) of the foundation model to invoke under Amazon Bedrock\n", + "- `accept`: The MIME type of the inference response (output)\n", + "- `contentType`: The MIME type of the request body (input)\n", + "- `body`: A JSON string consisting of the prompt and the inference configuration\n", + "\n", + "Available text generation models under Amazon Bedrock have the following IDs:\n", + "- `amazon.titan-tg1-large`\n", + "- `amazon.titan-e1t-medium`\n", + "- `ai21.j2-grande-instruct`\n", + "- `ai21.j2-jumbo-instruct`\n", + "- `ai21.j2-mid`\n", + "- `ai21.j2-ultra`\n", + "- `anthropic.claude-instant-v1`\n", + "- `anthropic.claude-v1`\n", + "- `anthropic.claude-v2`" + ] + }, + { + "cell_type": "markdown", + "id": "088cf6bf-dd73-4710-a0cc-6c11d220c431", + "metadata": {}, + "source": [ + "#### Invoke the Anthropic Claude v2 model" + ] + }, + { + "cell_type": "markdown", + "id": "379498f2", + "metadata": {}, + "source": [ + "First, we explore how the model generates an output based on the prompt created earlier.\n", + "\n", + "##### Complete Output Generation" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "016a118a", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Here is a Python program to analyze the sales CSV file as described:\n", + "\n", + "```python\n", + "import csv\n", + "from collections import defaultdict\n", + "import matplotlib.pyplot as plt\n", + "\n", + "revenue_by_month = defaultdict(int)\n", + "\n", + "with open('sales.csv', 'r') as f:\n", + " reader = csv.DictReader(f)\n", + " total_revenue = 0\n", + " max_revenue_product = None\n", + " max_revenue = 0\n", + " max_revenue_date = None\n", + "\n", + " for row in reader:\n", + " revenue = float(row['price']) * int(row['units_sold'])\n", + " total_revenue += revenue\n", + "\n", + " date = row['date']\n", + " month = date.split('-')[1]\n", + " revenue_by_month[month] += revenue\n", + "\n", + " if revenue > max_revenue:\n", + " max_revenue = revenue\n", + " max_revenue_product = row['product_id']\n", + " max_revenue_date = date\n", + "\n", + "print('Total revenue:', total_revenue)\n", + "print('Product with max revenue:', max_revenue_product)\n", + "print('Date with max revenue:', max_revenue_date)\n", + "\n", + "plt.bar(revenue_by_month.keys(), revenue_by_month.values())\n", + "plt.xlabel('Month')\n", + "plt.ylabel('Revenue')\n", + "plt.title('Revenue by Month')\n", + "plt.show()\n", + "```\n", + "\n", + "This loads the CSV data, calculates the total revenue, finds the product and date with max revenue,\n", + "and visualizes the revenue per month in a bar chart. 
The defaultdict is used to easily accumulate\n", + "values by month.\n" + ] + } + ], + "source": [ + "modelId = 'anthropic.claude-v2' # change this to use a different version from the model provider\n", + "accept = 'application/json'\n", + "contentType = 'application/json'\n", + "\n", + "response = boto3_bedrock.invoke_model(body=body, modelId=modelId, accept=accept, contentType=contentType)\n", + "response_body = json.loads(response.get('body').read())\n", + "\n", + "print_ww(response_body.get('completion'))" + ] + }, + { + "cell_type": "markdown", + "id": "ddddd1ec", + "metadata": {}, + "source": [ + "#### (Optional) Execute the Bedrock generated code for validation. Go to text editor to copy the generated code as printed output can be trucncated. Replce the code in below cell." + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "id": "395fad3b", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Total revenue: 35490.0\n", + "Product with max revenue: P003\n", + "Date with max revenue: 2023-04-23\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAk0AAAHHCAYAAACiOWx7AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAAA/30lEQVR4nO3df3zN9f//8fuZ2Y+wzbDNNCwKiyjeWAm97WvYu1opv0Zkkd5bPyg/9q786gdWy4+Spd6ZincovEV+LKq9Mb+GRCi9J8K2vGc7ITPb6/uHz16XTvPjZY1zNrfr5XIuF+f1fJznebyeabt7vV7ndWyGYRgCAADAJbk5uwEAAICKgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAWde7cWc2bN3d2Gy5h0KBBql69urPbAK4pQhNwHUpJSZHNZjMf7u7uqlevngYNGqQjR444u73rXsl/l8cee+yC488//7xZc/z48avWx+nTpzV+/Hh99dVXV+09gIrE3dkNAHCeiRMnKjQ0VGfOnNGmTZuUkpKi9evXa/fu3fLy8nJ2e9c1Ly8vffrpp3r77bfl4eHhMPavf/1LXl5eOnPmzFXt4fTp05owYYKk80fZgOsdR5qA61j37t3Vv39/PfbYY3rvvff03HPP6ccff9SyZcuc3dp1r1u3brLb7Vq5cqXD9o0bNyozM1NRUVFO6gy4fhGaAJjuvvtuSdKPP/7osH3fvn166KGH5O/vLy8vL7Vp08YhWG3btk02m01z584tNefq1atls9m0fPlyc9uRI0c0ePBgBQYGytPTU7feeqvef/99h9d99dVXstlsWrhwoV555RXdeOON8vLyUpcuXXTgwAGH2oYNG2rQoEGl3rtz586ljpAUFBRo3Lhxaty4sTw9PRUSEqJRo0apoKDA0hpJUkZGhu688055e3srNDRUycnJ5tjJkydVrVo1Pf3006Ve9/PPP6tKlSqaNGnSZd+jXr166tixo+bPn++wfd68eWrRosVFr61atGiRWrduLW9vb9WuXVv9+/cvdcq15HqkI0eOKDo6WtWrV1edOnX03HPPqaioSJJ08OBB1alTR5I0YcIE83Tg+PHjHea61BxAZUNoAmA6ePCgJKlmzZrmtj179qh9+/bau3evxowZo6SkJFWrVk3R0dFasmSJJKlNmza66aabtHDhwlJzLliwQDVr1lRkZKQkKTs7W+3bt9cXX3yh+Ph4TZ8+XY0bN1ZsbKymTZtW6vWTJ0/WkiVL9NxzzykhIUGbNm1STExMmfavuLhY9913n15//XXde++9evPNNxUdHa2pU6eqd+/eluY4ceKEevToodatWysxMVE33nijnnjiCTP0Va9eXQ888IAWLFhQKjz861//kmEYlvvv16+fPvvsM508eVKSdO7cOS1atEj9+vW7YH1KSop69eplBrMhQ4Zo8eLF6tChg/Ly8hxqi4qKFBkZqVq1aun1119Xp06dlJSUpNmzZ0uS6tSpo1mzZkmSHnjgAX344Yf68MMP9eCDD1qeA6h0DADXnTlz5hiSjC+++ML45ZdfjMOHDxuffPKJUadOHcPT09M4fPiwWdulSxejRYsWxpkzZ8xtxcXFxp133mncfPPN5raEhASjatWqRm5urrmtoKDA8PPzMwYPHmxui42NNerWrWscP37coac+ffoYvr6+xunTpw3DMIwvv/zSkGQ0a9bMKCgoMOumT59uSDK+/fZbc1uDBg2MgQMHltrPTp06GZ06dTKff/jhh4abm5vxn//8x6EuOTnZkGRs2LDhkuvWqVMnQ5KRlJTksI+tWrUyAgICjLNnzxqGYRirV682JBkrV650eP1tt93m0M/FSDLi4uKM3Nxcw8PDw/jwww8NwzCMFStWGDabzTh48KAxbtw4Q5Lxyy+/GIZhGGfPnjUCAgKM5s2bG7/99ps51/Llyw1JxtixY81tAwcONCQZEydOdHjf22+/3WjdurX5/JdffjEkGePGjSvVo9U5gMqEI03AdSwiIkJ16tRRSEiIHnroIVWrVk3Lli3TjTfeKEnKzc3VunXr1KtXL/366686fvy4jh8/rv/973+KjIzUDz/8YJ766d27twoLC7V48WJz/jVr1igvL888imMYhj799FPde++9MgzDnO/48eOKjIxUfn6+tm/f7tDjo48+6nAhdMkpxP/+979XvL+LFi1Ss2bN1LRpU4f3/utf/ypJ+vLLLy87h7u7ux5//HHzuYeHhx5//HHl5OQoIyPDXNfg4GDNmzfPrNu9e7d27dql/v37W+63Zs2a6tatm/71r39JkubPn68777xTDRo0KFW7bds25eTk6O9//7vDRfxRUVFq2rSpVqxYUeo1w4YNc3h+9913X/G6lsccQEXBp+eA69jMmTN1y
y23KD8/X++//77S0tLk6elpjh84cECGYejFF1/Uiy++eME5cnJyVK9ePbVs2VJNmzbVggULFBsbK+n8qbnatWuboeSXX35RXl6eZs+efdFTODk5OQ7P69ev7/C85NThiRMnrnh/f/jhB+3du9e8Vudy730hwcHBqlatmsO2W265RdL505vt27eXm5ubYmJiNGvWLJ0+fVo33HCD5s2bJy8vLz388MNX1HO/fv00YMAAHTp0SEuXLlViYuIF63766SdJUpMmTUqNNW3aVOvXr3fY5uXlVWodataseUXrWh5zABUJoQm4jrVt21Zt2rSRJEVHR6tDhw7q16+f9u/fr+rVq6u4uFiS9Nxzz5nXJP1R48aNzT/37t1br7zyio4fP64aNWpo2bJl6tu3r9zdz/+oKZmvf//+Gjhw4AXnu+222xyeV6lS5YJ1hmGYf7bZbBesKSoqcnh9cXGxWrRooTfeeOOC9SEhIRfcXhaPPPKIXnvtNS1dulR9+/bV/Pnz9be//U2+vr5XNM99990nT09PDRw4UAUFBerVq1e59Hexdb3WcwAVCaEJgCSZFw/fc889euuttzRmzBjddNNNkqSqVasqIiLisnP07t1bEyZM0KeffqrAwEDZ7Xb16dPHHK9Tp45q1KihoqIiS/NZVbNmzVIXOkvnj76U7IMkNWrUSN988426dOly0aB1OUePHtWpU6ccjjZ9//33ks5/iq9E8+bNdfvtt2vevHm68cYbdejQIb355ptX/H7e3t6Kjo7WRx99pO7du6t27doXrCs5Zbd//37zyF6J/fv3X/CU3uWUdY2AyoprmgCYOnfurLZt22ratGk6c+aMAgIC1LlzZ73zzjs6duxYqfpffvnF4XmzZs3UokULLViwQAsWLFDdunXVsWNHc7xKlSrq2bOnPv30U+3evfuy81nVqFEjbdq0SWfPnjW3LV++XIcPH3ao69Wrl44cOaJ333231By//fabTp06ddn3OnfunN555x3z+dmzZ/XOO++oTp06at26tUPtgAEDtGbNGk2bNk21atVS9+7dr3TXJJ0/0jdu3LiLniKVzn+CMSAgQMnJyQ63T1i5cqX27t1bpvs63XDDDZJ0wUAKXI840gTAwciRI/Xwww8rJSVFw4YN08yZM9WhQwe1aNFCQ4YM0U033aTs7Gylp6fr559/1jfffOPw+t69e2vs2LHy8vJSbGys3Nwc/202efJkffnll2rXrp2GDBmisLAw5ebmavv27friiy+Um5t7xT0/9thj+uSTT9StWzf16tVLP/74oz766CM1atTIoW7AgAFauHChhg0bpi+//FJ33XWXioqKtG/fPi1cuFCrV682T1deTHBwsKZMmaKDBw/qlltu0YIFC7Rz507Nnj1bVatWdajt16+fRo0apSVLluiJJ54oNW5Vy5Yt1bJly0vWVK1aVVOmTNGjjz6qTp06qW/fvsrOztb06dPVsGFDDR8+/Irf19vbW2FhYVqwYIFuueUW+fv7q3nz5nz/Hq5bHGkC4ODBBx9Uo0aN9Prrr6uoqEhhYWHatm2boqKilJKSori4OCUnJ8vNzU1jx44t9frevXuruLhYp0+fvuC9jwIDA7VlyxY9+uijWrx4sXmvptzcXE2ZMqVMPUdGRiopKUnff/+9nnnmGaWnp2v58uXmpwBLuLm5aenSpZo8ebK+/fZbPffcc5owYYK2bt2qp59+2ryg+1Jq1qypzz//XNu2bdPIkSN1+PBhvfXWWxoyZMgF97Vr166Szge2q23QoEFasGCBzp49q9GjR+udd97RAw88oPXr18vPz69Mc7733nuqV6+ehg8frr59++qTTz4p36aBCsRm/P5qSgBAuXrggQf07bfflrqLOYCKhyNNAHCVHDt2TCtWrLgmR5kAXH1c0wQA5SwzM1MbNmzQe++9p6pVqzrcDBNAxcWRJgAoZ19//bUGDBigzMxMzZ07V0FBQc5uCUA54JomAAAACzjSBAAAYAGhCQAAwAIuBC8nxcXFOnr0qGrUqMFXDwAAUEEYhqFff/1VwcHBpW7G+0eEpnJy9OjRcv2yTwAAcO0cPny41A1x/4jQVE5q1Kgh6fyi+/j4OLkbAABghd1uV0hIiPl7/FIITeWk5JScj48PoQkAgArGyqU1Tr0QPC0tTffee6+Cg4Nls9m0dOnSi9YOGzZMNptN06ZNc9iem5urmJgY+fj4yM/PT7GxsTp58qRDza5du3T33XfLy8tLISEhSkxMLDX/okWL1LRpU3l5ealFixb6/PPPy2MXAQBAJeHU0HTq1Cm1bNlSM2fOvGTdkiVLtGnTJgUHB5cai4mJ0Z49e5Samqrly5crLS1NQ4cONcftdru6du2qBg0aKCMjQ6+99prGjx+v2bNnmzUbN25U3759FRsbqx07dig6OlrR0dHavXt3+e0sAACo2AwXIclYsmRJqe0///yzUa9ePWP37t1GgwYNjKlTp5pj3333nSHJ2Lp1q7lt5cqVhs1mM44cOWIYhmG8/fbbRs2aNY2CggKzZvTo0UaTJk3M57169TKioqIc3rddu3bG448/brn//Px8Q5KRn59v+TUAAMC5ruT3t0vfp6m4uFgDBgzQyJEjdeutt5YaT09Pl5+fn9q0aWNui4iIkJubmzZv3mzWdOzYUR4eHmZNZGSk9u/frxMnTpg1ERERDnNHRkYqPT39or0VFBTIbrc7PAAAQOXl0qFpypQpcnd311NPPXXB8aysLAUEBDhsc3d3l7+/v7KyssyawMBAh5qS55erKRm/kEmTJsnX19d8cLsBAAAqN5cNTRkZGZo+fbpSUlJc8maRCQkJys/PNx+HDx92dksAAOAqctnQ9J///Ec5OTmqX7++3N3d5e7urp9++knPPvusGjZsKEkKCgpSTk6Ow+vOnTun3Nxc81vFg4KClJ2d7VBT8vxyNZf6ZnJPT0/z9gLcZgAAgMrPZUPTgAEDtGvXLu3cudN8BAcHa+TIkVq9erUkKTw8XHl5ecrIyDBft27dOhUXF6tdu3ZmTVpamgoLC82a1NRUNWnSRDVr1jRr1q5d6/D+qampCg8Pv9q7CQAAKgin3tzy5MmTOnDggPk8MzNTO3fulL+/v+rXr69atWo51FetWlVBQUFq0qSJJKlZs2bq1q2bhgwZouTkZBUWFio+Pl59+vQxb0/Qr18/TZgwQbGxsRo9erR2796t6dOna+rUqea8Tz/9tDp16qSkpCRFRUXp448/1rZt2xxuSwAAAK5z1+DTfBf15ZdfGpJKPQYOHHjB+j/ecsAwDON///uf0bdvX6N69eqGj4+P8eijjxq//vqrQ80333xjdOjQwfD09DTq1atnTJ48udTcCxcuNG655RbDw8PDuPXWW40VK1Zc0b5wywEAACqeK/n9bTMMw3BiZqs07Ha7fH19lZ+fz/VNAABUEFfy+9tlr2kCAABwJYQmAAAACwhNAAAAFhCaAAAALHDqLQdgXcMxK5zdQoVxcHKUs1sAAFRCHGkCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhN
AAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAVODU1paWm69957FRwcLJvNpqVLl5pjhYWFGj16tFq0aKFq1aopODhYjzzyiI4ePeowR25urmJiYuTj4yM/Pz/Fxsbq5MmTDjW7du3S3XffLS8vL4WEhCgxMbFUL4sWLVLTpk3l5eWlFi1a6PPPP78q+wwAAComp4amU6dOqWXLlpo5c2apsdOnT2v79u168cUXtX37di1evFj79+/Xfffd51AXExOjPXv2KDU1VcuXL1daWpqGDh1qjtvtdnXt2lUNGjRQRkaGXnvtNY0fP16zZ882azZu3Ki+ffsqNjZWO3bsUHR0tKKjo7V79+6rt/MAAKBCsRmGYTi7CUmy2WxasmSJoqOjL1qzdetWtW3bVj/99JPq16+vvXv3KiwsTFu3blWbNm0kSatWrVKPHj30888/Kzg4WLNmzdLzzz+vrKwseXh4SJLGjBmjpUuXat++fZKk3r1769SpU1q+fLn5Xu3bt1erVq2UnJxsqX+73S5fX1/l5+fLx8enjKtwcQ3HrCj3OSurg5OjnN0CAKCCuJLf3xXqmqb8/HzZbDb5+flJktLT0+Xn52cGJkmKiIiQm5ubNm/ebNZ07NjRDEySFBkZqf379+vEiRNmTUREhMN7RUZGKj09/aK9FBQUyG63OzwAAEDlVWFC05kzZzR69Gj17dvXTIJZWVkKCAhwqHN3d5e/v7+ysrLMmsDAQIeakueXqykZv5BJkybJ19fXfISEhPy5HQQAAC6tQoSmwsJC9erVS4ZhaNasWc5uR5KUkJCg/Px883H48GFntwQAAK4id2c3cDklgemnn37SunXrHM43BgUFKScnx6H+3Llzys3NVVBQkFmTnZ3tUFPy/HI1JeMX4unpKU9Pz7LvGAAAqFBc+khTSWD64Ycf9MUXX6hWrVoO4+Hh4crLy1NGRoa5bd26dSouLla7du3MmrS0NBUWFpo1qampatKkiWrWrGnWrF271mHu1NRUhYeHX61dAwAAFYxTQ9PJkye1c+dO7dy5U5KUmZmpnTt36tChQyosLNRDDz2kbdu2ad68eSoqKlJWVpaysrJ09uxZSVKzZs3UrVs3DRkyRFu2bNGGDRsUHx+vPn36KDg4WJLUr18/eXh4KDY2Vnv27NGCBQs0ffp0jRgxwuzj6aef1qpVq5SUlKR9+/Zp/Pjx2rZtm+Lj46/5mgAAANfk1FsOfPXVV7rnnntKbR84cKDGjx+v0NDQC77uyy+/VOfOnSWdv7llfHy8PvvsM7m5ualnz56aMWOGqlevbtbv2rVLcXFx2rp1q2rXrq0nn3xSo0ePdphz0aJFeuGFF3Tw4EHdfPPNSkxMVI8ePSzvC7cccB3ccgAAYNWV/P52mfs0VXSEJtdBaAIAWFVp79MEAADgLIQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFTg1NaWlpuvfeexUcHCybzaalS5c6jBuGobFjx6pu3bry9vZWRESEfvjhB4ea3NxcxcTEyMfHR35+foqNjdXJkycdanbt2qW7775bXl5eCgkJUWJiYqleFi1apKZNm8rLy0stWrTQ559/Xu77CwAAKi6nhqZTp06pZcuWmjlz5gXHExMTNWPGDCUnJ2vz5s2qVq2aIiMjdebMGbMmJiZGe/bsUWpqqpYvX660tDQNHTrUHLfb7eratasaNGigjIwMvfbaaxo/frxmz55t1mzcuFF9+/ZVbGysduzYoejoaEVHR2v37t1Xb+cBAECFYjMMw3B2E5Jks9m0ZMkSRUdHSzp/lCk4OFjPPvusnnvuOUlSfn6+AgMDlZKSoj59+mjv3r0KCwvT1q1b1aZNG0nSqlWr1KNHD/38888KDg7WrFmz9PzzzysrK0seHh6SpDFjxmjp0qXat2+fJKl37946deqUli9fbvbTvn17tWrVSsnJyZb6t9vt8vX1VX5+vnx8fMprWUwNx6wo9zkrq4OTo5zdAgCggriS398ue01TZmamsrKyFBERYW7z9fVVu3btlJ6eLklKT0+Xn5+fGZgkKSIiQm5ubtq8ebNZ07FjRzMwSVJkZKT279+vEydOmDW/f5+SmpL3uZCCggLZ7XaHBwAAqLxcNjRlZWVJkgIDAx22BwYGmmNZWVkKCAhwGHd3d5e/v79DzYXm+P17XKymZPxCJk2aJF9fX/MREhJypbsIAAAqEJcNTa4uISFB+fn55uPw4cPObgkAAFxFLhuagoKCJEnZ2dkO27Ozs82xoKAg5eTkOIyfO3dOubm5DjUXmuP373GxmpLxC/H09JSPj4/DAwAAVF4uG5pCQ0MVFBSktWvXmtvsdrs2b96s8PBwSVJ4eLjy8vKUkZFh1qxbt07FxcVq166dWZOWlqbCwkKzJjU1VU2aNFHNmjXNmt+/T0lNyfsAAAA4NTSdPHlSO3fu1M6dOyWdv/h7586dOnTokGw2m5555hm9/PLLWrZsmb799ls98sgjCg4ONj9h16xZM3Xr1k1DhgzRli1btGHDBsXHx6tPnz4KDg6WJPXr108eHh6KjY3Vnj17tGDBAk2fPl0jRoww+3j66ae1atUqJSUlad++fRo/fry2bdum+Pj4a70kAADARbk78823bdume+65x3xeEmQGDhyolJQUjRo1SqdOndLQoUOVl5enDh06aNWqVfLy8jJfM2/ePMXHx6tLly5yc3NTz549NWPGDHPc19dXa9asUVxcnFq3bq3atWtr7NixDvdyuvPOOzV//ny98MIL+sc//qGbb75ZS5cuVfPmza/BKgAAgIrAZe7TVNFxnybXwX2aAABWVYr7NAEAALgSQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYIG7sxsAALiGhmNWOLuFCuPg5ChntwAn4EgTAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQA
AwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgQZlDU15ent577z0lJCQoNzdXkrR9+3YdOXKk3JorKirSiy++qNDQUHl7e6tRo0Z66aWXZBiGWWMYhsaOHau6devK29tbERER+uGHHxzmyc3NVUxMjHx8fOTn56fY2FidPHnSoWbXrl26++675eXlpZCQECUmJpbbfgAAgIqvTKFp165duuWWWzRlyhS9/vrrysvLkyQtXrxYCQkJ5dbclClTNGvWLL311lvau3evpkyZosTERL355ptmTWJiombMmKHk5GRt3rxZ1apVU2RkpM6cOWPWxMTEaM+ePUpNTdXy5cuVlpamoUOHmuN2u11du3ZVgwYNlJGRoddee03jx4/X7Nmzy21fAABAxVam0DRixAgNGjRIP/zwg7y8vMztPXr0UFpaWrk1t3HjRt1///2KiopSw4YN9dBDD6lr167asmWLpPNHmaZNm6YXXnhB999/v2677TZ98MEHOnr0qJYuXSpJ2rt3r1atWqX33ntP7dq1U4cOHfTmm2/q448/1tGjRyVJ8+bN09mzZ/X+++/r1ltvVZ8+ffTUU0/pjTfeKLd9AQAAFVuZQtPWrVv1+OOPl9per149ZWVl/emmStx5551au3atvv/+e0nSN998o/Xr16t79+6SpMzMTGVlZSkiIsJ8ja+vr9q1a6f09HRJUnp6uvz8/NSmTRuzJiIiQm5ubtq8ebNZ07FjR3l4eJg1kZGR2r9/v06cOFFu+wMAACou97K8yNPTU3a7vdT277//XnXq1PnTTZUYM2aM7Ha7mjZtqipVqqioqEivvPKKYmJiJMkMaIGBgQ6vCwwMNMeysrIUEBDgMO7u7i5/f3+HmtDQ0FJzlIzVrFmzVG8FBQUqKCgwn19oPQAAQOVRpiNN9913nyZOnKjCwkJJks1m06FDhzR69Gj17Nmz3JpbuHCh5s2bp/nz52v79u2aO3euXn/9dc2dO7fc3qOsJk2aJF9fX/MREhLi7JYAAMBVVKbQlJSUpJMnTyogIEC//fabOnXqpMaNG6tGjRp65ZVXyq25kSNHasyYMerTp49atGihAQMGaPjw4Zo0aZIkKSgoSJKUnZ3t8Lrs7GxzLCgoSDk5OQ7j586dU25urkPNheb4/Xv8UUJCgvLz883H4cOH/+TeAgAAV1am03O+vr5KTU3V+vXrtWvXLp08eVJ33HGHw7VF5eH06dNyc3PMdVWqVFFxcbEkKTQ0VEFBQVq7dq1atWol6fxpss2bN+uJJ56QJIWHhysvL08ZGRlq3bq1JGndunUqLi5Wu3btzJrnn39ehYWFqlq1qiQpNTVVTZo0ueCpOen8KUpPT89y3V8AAOC6yhSaSnTo0EEdOnQor15Kuffee/XKK6+ofv36uvXWW7Vjxw698cYbGjx4sKTzpwWfeeYZvfzyy7r55psVGhqqF198UcHBwYqOjpYkNWvWTN26ddOQIUOUnJyswsJCxcfHq0+fPgoODpYk9evXTxMmTFBsbKxGjx6t3bt3a/r06Zo6depV2zcAAFCxlCk0TZw48ZLjY8eOLVMzf/Tmm2/qxRdf1N///nfl5OQoODhYjz/+uMP8o0aN0qlTpzR06FDl5eWpQ4cOWrVqlcOtEObNm6f4+Hh16dJFbm5u6tmzp2bMmGGO+/r6as2aNYqLi1Pr1q1Vu3ZtjR071uFeTgAA4PpmM35/e22Lbr/9dofnhYWFyszMlLu7uxo1aqTt27eXW4MVhd1ul6+vr/Lz8+Xj41Pu8zccs6Lc56ysDk6OcnYLQIXEzxnr+DlTeVzJ7+8yHWnasWPHBd900KBBeuCBB8oyJQAAgEsrty/s9fHx0YQJE/Tiiy+W15QAAAAuo9xCkyTz4/cAAACVTZlOz/3+Imrp/HfAHTt2TB9++KH5FScAAACVSZlC0x8/iu/m5qY6depo4MCBSkhIKJfGAAAAXEmZQlNmZmZ59wEAAODSyvWaJgAAgMqqTEeaTp06pcmTJ2vt2rXKyckxv9akxH//+99yaQ4AAMBVlCk0PfbYY/r66681YMAA1a1bVzabrbz7AgAAcCllCk0rV67UihUrdNddd5V3PwAAAC6pTNc01axZU/7+/uXdCwAAgMsqU2h66aWXNHbsWJ0+fbq8+wEAAHBJZTo9l5SUpB9//FGBgYFq2LChqlat6jB+PX5hLwAAqNzKFJqio6PLuQ0AAADXVqbQNG7cuPLuAwAAwKWV+eaWeXl5eu+995SQkKDc3FxJ50/LHTlypNyaAwAAcBVlOtK0a9cuRUREyNfXVwcPHtSQIUPk7++vxYsX69ChQ/rggw/Ku08AAACnKtORphEjRmjQoEH64Ycf5OXlZW7v0aOH0tLSyq05AAAAV1Gm0LR161Y9/vjjpbbXq1dPWVlZf7opAAAAV1Om0OTp6Sm73V5q+/fff686der86aYAAABcTZlC03333aeJEyeqsLBQkmSz2XTo0CGNHj1aPXv2LNcGAQAAXEGZQlNSUpJOnjypgIAA/fbbb+rUqZMaN26sGjVq6JVXXinvHgEAAJyuTJ+e8/X1VWpqqtavX69du3bp5MmTuuOOOxQREVHe/QEAALiEMoWmw4cPKyQkRB06dFCHDh3KuycAAACXU6bTcw0bNlSnTp307rvv6sSJE+XdEwAAgMspU2jatm2b2rZtq4kTJ6pu3bqKjo7WJ598ooKCgvLuDwAAwCWUKTTdfvvteu2113To0CGtXLlSderU0dChQxUYGKjBgweXd48AAABOV+bvnpPO32rgnnvu0bvvvqsvvvhCoaGhmjt3bnn1BgAA4DL+VGj6+eeflZiYqFatWqlt27aqXr26Zs6cWV69AQAAuIwyfXrunXfe0fz587VhwwY1bdpUMTEx+ve//60GDRqUd38AAAAuoUyh6eWXX1bfvn01Y8YMtWzZsrx7AgAAcDllCk2HDh2SzWYr714AAABcVpmuabLZbPrPf/6j/v37Kzw8XEeOHJEkffjhh1q/fn25NggAAOAKyhSaPv30U0VGRsrb21s7duww78+Un5+vV199tVwbBAAAcAVlCk0vv/yykpOT9e6776pq1arm9rvuukvbt28vt+YAAABcRZlC0/79+9WxY8dS2319fZWXl/dnewIAAHA5ZQpNQUFBOnDgQKnt69ev10033fSnmwIAAHA1ZQpNQ4YM0dNPP63NmzfLZrPp6NGjmjdvnp599lk98cQT5d0jAACA05XplgNjxoxRcXGxunTpotOnT6tjx47y9PTUyJEj9dhjj5V3jwAAAE5X5lsOPP/888rNzdXu3bu1adMm/fLLL/L19VVoaGh59wgAAOB0VxSaCgoKlJCQoDZt2uiuu+7S559/rrCwMO3Zs0dNmjTR9OnTNXz48KvVKwAAgNNcUWgaO3asZs2apYYNGyozM1MPP/ywhg4dqqlTpyopKUmZmZkaPXp0uTZ45M
gR9e/fX7Vq1ZK3t7datGihbdu2meOGYWjs2LGqW7euvL29FRERoR9++MFhjtzcXMXExMjHx0d+fn6KjY3VyZMnHWp27dqlu+++W15eXgoJCVFiYmK57gcAAKjYrig0LVq0SB988IE++eQTrVmzRkVFRTp37py++eYb9enTR1WqVCnX5k6cOKG77rpLVatW1cqVK/Xdd98pKSlJNWvWNGsSExM1Y8YMJScna/PmzapWrZoiIyN15swZsyYmJkZ79uxRamqqli9frrS0NA0dOtQct9vt6tq1qxo0aKCMjAy99tprGj9+vGbPnl2u+wMAACquK7oQ/Oeff1br1q0lSc2bN5enp6eGDx9+1b6HbsqUKQoJCdGcOXPMbb+/ZsowDE2bNk0vvPCC7r//fknSBx98oMDAQC1dulR9+vTR3r17tWrVKm3dulVt2rSRJL355pvq0aOHXn/9dQUHB2vevHk6e/as3n//fXl4eOjWW2/Vzp079cYbbziEKwAAcP26oiNNRUVF8vDwMJ+7u7urevXq5d5UiWXLlqlNmzZ6+OGHFRAQoNtvv13vvvuuOZ6ZmamsrCxFRESY23x9fdWuXTulp6dLktLT0+Xn52cGJkmKiIiQm5ubNm/ebNZ07NjRYd8iIyO1f/9+nThx4oK9FRQUyG63OzwAAEDldUVHmgzD0KBBg+Tp6SlJOnPmjIYNG6Zq1ao51C1evLhcmvvvf/+rWbNmacSIEfrHP/6hrVu36qmnnpKHh4cGDhyorKwsSVJgYKDD6wIDA82xrKwsBQQEOIy7u7vL39/foeaPn/ormTMrK8vhdGCJSZMmacKECeWynwAAwPVdUWgaOHCgw/P+/fuXazN/VFxcrDZt2phfAnz77bdr9+7dSk5OLtXLtZaQkKARI0aYz+12u0JCQpzYEQAAuJquKDT9/tqia6Fu3boKCwtz2NasWTN9+umnks5/nYskZWdnq27dumZNdna2WrVqZdbk5OQ4zHHu3Dnl5uaarw8KClJ2drZDTcnzkpo/8vT0NI+4AQCAyq9MN7e8Vu666y7t37/fYdv333+vBg0aSDp/UXhQUJDWrl1rjtvtdm3evFnh4eGSpPDwcOXl5SkjI8OsWbdunYqLi9WuXTuzJi0tTYWFhWZNamqqmjRpcsFTcwAA4Prj0qFp+PDh2rRpk1599VUdOHBA8+fP1+zZsxUXFyfp/J3Jn3nmGb388statmyZvv32Wz3yyCMKDg5WdHS0pPNHprp166YhQ4Zoy5Yt2rBhg+Lj49WnTx8FBwdLkvr16ycPDw/FxsZqz549WrBggaZPn+5w+g0AAFzfyvTdc9fKX/7yFy1ZskQJCQmaOHGiQkNDNW3aNMXExJg1o0aN0qlTpzR06FDl5eWpQ4cOWrVqlby8vMyaefPmKT4+Xl26dJGbm5t69uypGTNmmOO+vr5as2aN4uLi1Lp1a9WuXVtjx47ldgMAAMBkMwzDcHYTlYHdbpevr6/y8/Pl4+NT7vM3HLOi3OesrA5OjnJ2C0CFxM8Z6/g5U3lcye9vlz49BwAA4CoITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGCBS9+nCXA2PoJtHR/BBlDZcaQJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACd2c3AAB/1HDMCme3UGEcnBzl7BaA6wZHmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFlSo0DR58mTZbDY988wz5rYzZ84oLi5OtWrVUvXq1dWzZ09lZ2c7vO7QoUOKiorSDTfcoICAAI0cOVLnzp1zqPnqq690xx13yNPTU40bN1ZKSso12CMAAFBRVJjQtHXrVr3zzju67bbbHLYPHz5cn332mRYtWqSvv/5aR48e1YMPPmiOFxUVKSoqSmfPntXGjRs1d+5cpaSkaOzYsWZNZmamoqKidM8992jnzp165pln9Nhjj2n16tXXbP8AAIBrqxCh6eTJk4qJidG7776rmjVrmtvz8/P1z3/+U2+88Yb++te/qnXr1pozZ442btyoTZs2SZLWrFmj7777Th999JFatWql7t2766WXXtLMmTN19uxZSVJycrJCQ0OVlJSkZs2aKT4+Xg899JCmTp3qlP0FAACup0KEpri4OEVFRSkiIsJhe0ZGhgoLCx22N23aVPXr11d6erokKT09XS1atFBgYKBZExkZKbvdrj179pg1f5w7MjLSnONCCgoKZLfbHR4AAKDycnd2A5fz8ccfa/v27dq6dWupsaysLHl4eMjPz89he2BgoLKyssya3wemkvGSsUvV2O12/fbbb/L29i713pMmTdKECRPKvF8AAKBicekjTYcPH9bTTz+tefPmycvLy9ntOEhISFB+fr75OHz4sLNbAgAAV5FLh6aMjAzl5OTojjvukLu7u9zd3fX1119rxowZcnd3V2BgoM6ePau8vDyH12VnZysoKEiSFBQUVOrTdCXPL1fj4+NzwaNMkuTp6SkfHx+HBwAAqLxcOjR16dJF3377rXbu3Gk+2rRpo5iYGPPPVatW1dq1a83X7N+/X4cOHVJ4eLgkKTw8XN9++61ycnLMmtTUVPn4+CgsLMys+f0cJTUlcwAAALj0NU01atRQ8+bNHbZVq1ZNtWrVMrfHxsZqxIgR8vf3l4+Pj5588kmFh4erffv2kqSuXbsqLCxMAwYMUGJiorKysvTCCy8oLi5Onp6ekqRhw4bprbfe0qhRozR48GCtW7dOCxcu1IoVK67tDgMAAJfl0qHJiqlTp8rNzU09e/ZUQUGBIiMj9fbbb5vjVapU0fLly/XEE08oPDxc1apV08CBAzVx4kSzJjQ0VCtWrNDw4cM1ffp03XjjjXrvvfcUGRnpjF0CAAAuqMKFpq+++srhuZeXl2bOnKmZM2de9DUNGjTQ559/fsl5O3furB07dpRHiwAAoBJy6WuaAAAAXAWhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGCBu7MbAADgetVwzApnt1ChHJwc5dT350gTAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYIFLh6ZJkybpL3/5i2rUqKGAgABFR0dr//79DjVnzpxRXFycatWqperVq6tnz
57Kzs52qDl06JCioqJ0ww03KCAgQCNHjtS5c+ccar766ivdcccd8vT0VOPGjZWSknK1dw8AAFQgLh2avv76a8XFxWnTpk1KTU1VYWGhunbtqlOnTpk1w4cP12effaZFixbp66+/1tGjR/Xggw+a40VFRYqKitLZs2e1ceNGzZ07VykpKRo7dqxZk5mZqaioKN1zzz3auXOnnnnmGT322GNavXr1Nd1fAADgulz6juCrVq1yeJ6SkqKAgABlZGSoY8eOys/P1z//+U/Nnz9ff/3rXyVJc+bMUbNmzbRp0ya1b99ea9as0XfffacvvvhCgYGBatWqlV566SWNHj1a48ePl4eHh5KTkxUaGqqkpCRJUrNmzbR+/XpNnTpVkZGR13y/AQCA63HpI01/lJ+fL0ny9/eXJGVkZKiwsFARERFmTdOmTVW/fn2lp6dLktLT09WiRQsFBgaaNZGRkbLb7dqzZ49Z8/s5SmpK5riQgoIC2e12hwcAAKi8KkxoKi4u1jPPPKO77rpLzZs3lyRlZWXJw8NDfn5+DrWBgYHKysoya34fmErGS8YuVWO32/Xbb79dsJ9JkybJ19fXfISEhPzpfQQAAK6rwoSmuLg47d69Wx9//LGzW5EkJSQkKD8/33wcPnzY2S0BAICryKWvaSoRHx+v5cuXKy0tTTfeeKO5PSgoSGfPnlVeXp7D0abs7GwFBQWZNVu2bHGYr+TTdb+v+eMn7rKzs+Xj4yNvb+8L9uTp6SlPT88/vW8AAKBicOkjTYZhKD4+XkuWLNG6desUGhrqMN66dWtVrVpVa9euNbft379fhw4dUnh4uCQpPDxc3377rXJycsya1NRU+fj4KCwszKz5/RwlNSVzAAAAuPSRpri4OM2fP1///ve/VaNGDfMaJF9fX3l7e8vX11exsbEaMWKE/P395ePjoyeffFLh4eFq3769JKlr164KCwvTgAEDlJiYqKysLL3wwguKi4szjxQNGzZMb731lkaNGqXBgwdr3bp1WrhwoVasWOG0fQcAAK7FpY80zZo1S/n5+ercubPq1q1rPhYsWGDWTJ06VX/729/Us2dPdezYUUFBQVq8eLE5XqVKFS1fvlxVqlRReHi4+vfvr0ceeUQTJ040a0JDQ7VixQqlpqaqZcuWSkpK0nvvvcftBgAAgMmljzQZhnHZGi8vL82cOVMzZ868aE2DBg30+eefX3Kezp07a8eOHVfcIwAAuD649JEmAAAAV0FoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDQBAABYQGgCAACwgNAEAABgAaEJAADAAkITAACABYQmAAAACwhNAAAAFhCaAAAALCA0AQAAWEBoAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAFhCYAAAALCE0AAAAWEJoAAAAsIDT9wcyZM9WwYUN5eXmpXbt22rJli7NbAgAALoDQ9DsLFizQiBEjNG7cOG3fvl0tW7ZUZGSkcnJynN0aAABwMkLT77zxxhsaMmSIHn30UYWFhSk5OVk33HCD3n//fWe3BgAAnIzQ9H/Onj2rjIwMRUREmNvc3NwUERGh9PR0J3YGAABcgbuzG3AVx48fV1FRkQIDAx22BwYGat++faXqCwoKVFBQYD7Pz8+XJNnt9qvSX3HB6asyb2VUnv8NWHfrWHfnYN2do7zWnTW/Mlfjd2zJnIZhXLaW0FRGkyZN0oQJE0ptDwkJcUI3+D3fac7u4PrEujsH6+4crLtzXM11//XXX+Xr63vJGkLT/6ldu7aqVKmi7Oxsh+3Z2dkKCgoqVZ+QkKARI0aYz4uLi5Wbm6tatWrJZrNd9X6dzW63KyQkRIcPH5aPj4+z27lusO7Owbo7B+vuHNfbuhuGoV9//VXBwcGXrSU0/R8PDw+1bt1aa9euVXR0tKTzQWjt2rWKj48vVe/p6SlPT0+HbX5+ftegU9fi4+NzXfxP5WpYd+dg3Z2DdXeO62ndL3eEqQSh6XdGjBihgQMHqk2bNmrbtq2mTZumU6dO6dFHH3V2awAAwMkITb/Tu3dv/fLLLxo7dqyysrLUqlUrrVq1qtTF4QAA4PpDaPqD+Pj4C56OgyNPT0+NGzeu1ClKXF2su3Ow7s7BujsH635xNsPKZ+wAAACuc9zcEgAAwAJCEwAAgAWEJgAAAAsITQAAABYQmnBZM2fOVMOGDeXl5aV27dppy5Yt5tjs2bPVuXNn+fj4yGazKS8vz3mNVjIXW/fc3Fw9+eSTatKkiby9vVW/fn099dRT5vcf4s+51N/3xx9/XI0aNZK3t7fq1Kmj+++//4LfTYkrd6l1L2EYhrp37y6bzaalS5de+yYroUute+fOnWWz2Rwew4YNc2K3zkdowiUtWLBAI0aM0Lhx47R9+3a1bNlSkZGRysnJkSSdPn1a3bp10z/+8Q8nd1q5XGrdjx49qqNHj+r111/X7t27lZKSolWrVik2NtbZbVd4l/v73rp1a82ZM0d79+7V6tWrZRiGunbtqqKiIid3XrFdbt1LTJs27br4mqprxcq6DxkyRMeOHTMfiYmJTuzYBRjAJbRt29aIi4sznxcVFRnBwcHGpEmTHOq+/PJLQ5Jx4sSJa9xh5WR13UssXLjQ8PDwMAoLC69Vi5XSla77N998Y0gyDhw4cK1arJSsrPuOHTuMevXqGceOHTMkGUuWLHFCp5XL5da9U6dOxtNPP+2k7lwTR5pwUWfPnlVGRoYiIiLMbW5uboqIiFB6eroTO6vcyrLu+fn58vHxkbs796stqytd91OnTmnOnDkKDQ1VSEjItWy1UrGy7qdPn1a/fv00c+bMC36BOq6c1b/v8+bNU+3atdW8eXMlJCTo9OnTzmjXZRCacFHHjx9XUVFRqa+RCQwMVFZWlpO6qvyudN2PHz+ul156SUOHDr1WLVZKVtf97bffVvXq1VW9enWtXLlSqamp8vDwuNbtVhpW1n348OG68847df/99zujxUrJyrr369dPH330kb788kslJCToww8/VP/+/Z3Rrsvgn6VABWa32xUVFaWwsDCNHz/e2e1cF2JiYvT//t//07Fjx/T666+rV69e2rBhg7y8vJzdWqW0bNkyrVu3Tjt27HB2K9ed3/9DrEWLFqpbt666dOmiH3/8UY0aNXJiZ87DkSZcVO3atVWlShVlZ2c7bM/OzuYQ+VVkdd1//fVXdevWTTVq1NCSJUtUtWrVa91qpWJ13X19fXXzzTerY8eO+uSTT7Rv3z4tWbLkWrdbaVxu3detW6cff/xRfn5+cnd3N09B9+zZU507d3ZCx5VDWX6+t2vXTpJ04MCBq96fqyI04aI8PDzUunVrrV271txWXFystWvXKjw83ImdVW5W1t1ut6tr167y8PDQsmXLOMpRDsry990wDBmGoYKC
gmvVZqVzuXUfM2aMdu3apZ07d5oPSZo6darmzJnjpK4rvrL8fS9Z+7p1616LFl2Ts69Eh2v7+OOPDU9PTyMlJcX47rvvjKFDhxp+fn5GVlaWYRiGcezYMWPHjh3Gu+++a0gy0tLSjB07dhj/+9//nNx5xXapdc/PzzfatWtntGjRwjhw4IBx7Ngx83Hu3Dlnt16hXWrdf/zxR+PVV181tm3bZvz000/Ghg0bjHvvvdfw9/c3srOznd16hXa5nzN/JD49Vy4ute4HDhwwJk6caGzbts3IzMw0/v3vfxs33XST0bFjR2e37VSEJlzWm2++adSvX9/w8PAw2rZta2zatMkcGzdunCGp1GPOnDnOa7iSuNi6l9ze4UKPzMxM5zZdCVxs3Y8cOWJ0797dCAgIMKpWrWrceOONRr9+/Yx9+/Y5uePK4VI/Z/6I0FR+Lrbuhw4dMjp27Gj4+/sbnp6eRuPGjY2RI0ca+fn5Tu7YuWyGYRjOOMIFAABQkXBNEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAriKbzaalS5c6uw0A5YDQBKBSGjRokGw2m4YNG1ZqLC4uTjabTYMGDSq39xs/frxatWpVbvMBcD2EJgCVVkhIiD7++GP99ttv5rYzZ85o/vz5ql+/vhM7A1AREZoAVFp33HGHQkJCtHjxYnPb4sWLVb9+fd1+++3mtoKCAj311FMKCAiQl5eXOnTooK1bt5rjX331lWw2m9auXas2bdrohhtu0J133qn9+/dLklJSUjRhwgR98803stlsstlsSklJMV9//PhxPfDAA7rhhht08803a9myZVd/5wGUO0ITgEpt8ODBmjNnjvn8/fff16OPPupQM2rUKH366aeaO3eutm/frsaNGysyMlK5ubkOdc8//7ySkpK0bds2ubu7a/DgwZKk3r1769lnn9Wtt96qY8eO6dixY+rdu7f5ugkTJqhXr17atWuXevTooZiYmFJzA3B9hCYAlVr//v21fv16/fTTT/rpp5+0YcMG9e/f3xw/deqUZs2apddee03du3dXWFiY3n33XXl7e+uf//ynw1yvvPKKOnXqpLCwMI0ZM0YbN27UmTNn5O3trerVq8vd3V1BQUEKCgqSt7e3+bpBgwapb9++aty4sV599VWdPHlSW7ZsuWZrAKB8uDu7AQC4murUqaOoqCilpKTIMAxFRUWpdu3a5viPP/6owsJC3XXXXea2qlWrqm3bttq7d6/DXLfddpv557p160qScnJyLnt91O9fV61aNfn4+CgnJ+dP7ReAa4/QBKDSGzx4sOLj4yVJM2fOLPM8VatWNf9ss9kkScXFxVf0upLXWnkdANfC6TkAlV63bt109uxZFRYWKjIy0mGsUaNG8vDw0IYNG8xthYWF2rp1q8LCwiy/h4eHh4qKisqtZwCuhyNNACq9KlWqmKfaqlSp4jBWrVo1PfHEExo5cqT8/f1Vv359JSYm6vTp04qNjbX8Hg0bNlRmZqZ27typG2+8UTVq1JCnp2e57gcA5yI0Abgu+Pj4XHRs8uTJKi4u1oABA/Trr7+qTZs2Wr16tWrWrGl5/p49e2rx4sW65557lJeXpzlz5pTrzTMBOJ/NMAzD2U0AAAC4Oq5pAgAAsIDQBAAAYAGhCQAAwAJCEwAAgAWEJgAAAAsITQAAABYQmgAAACwgNAEAAFhAaAIAALCA0AQAAGABoQkAAMACQhMAAIAF/x+HyjFN/tpwHgAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Sample Generated Python Code ( Generated with Amazon Bedrock in previous step)\n", + "\n", + "import csv\n", + "from collections import defaultdict\n", + "import matplotlib.pyplot as plt\n", + " \n", + "revenue_by_month = defaultdict(int)\n", + "\n", + "with open('sales.csv', 'r') as f:\n", + " reader = csv.DictReader(f)\n", + " total_revenue = 0\n", + " max_revenue_product = None\n", + " max_revenue = 0\n", + " max_revenue_date = None\n", + "\n", + " for row in reader:\n", + " revenue = float(row['price']) * int(row['units_sold'])\n", + " total_revenue += revenue\n", + "\n", + " date = row['date']\n", + " month = date.split('-')[1]\n", + " revenue_by_month[month] += revenue\n", + "\n", + " if revenue > max_revenue:\n", + " max_revenue = revenue\n", + " max_revenue_product = row['product_id']\n", + " max_revenue_date = date\n", + "\n", + "print('Total revenue:', total_revenue)\n", + "print('Product with max revenue:', max_revenue_product)\n", + "print('Date with max revenue:', max_revenue_date)\n", + "'\n", + "# Plot 'Revenue by Month'\n", + "plt.bar(revenue_by_month.keys(), revenue_by_month.values())\n", + "plt.xlabel('Month')\n", + "plt.ylabel('Revenue')\n", + "plt.title('Revenue by Month')\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "64b08b3b", + "metadata": {}, + "source": [ + "## Conclusion\n", + "You have now experimented with using `boto3` SDK which provides a vanilla exposure to Amazon Bedrock API. Using this API you generate a python program to analyze and visualize given sales data'\n", + "\n", + "### Take aways\n", + "- Adapt this notebook to experiment with different models available through Amazon Bedrock such as Amazon Titan and AI21 Labs Jurassic models.\n", + "- Change the prompts to your specific usecase and evaluate the output of different models.\n", + "- Play with the token length to understand the latency and responsiveness of the service.\n", + "- Apply different prompt engineering principles to get better outputs.\n", + "\n", + "## Thank You" + ] + } + ], + "metadata": { + "availableInstances": [ + { + "_defaultOrder": 0, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.t3.medium", + "vcpuNum": 2 + }, + { + "_defaultOrder": 1, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.t3.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 2, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.t3.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 3, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.t3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 4, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 5, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 6, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 7, + 
"_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 8, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 9, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 10, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 11, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 12, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5d.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 13, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5d.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 14, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5d.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 15, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5d.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 16, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5d.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 17, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5d.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 18, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5d.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 19, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 20, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": true, + "memoryGiB": 0, + "name": "ml.geospatial.interactive", + "supportedImageNames": [ + "sagemaker-geospatial-v1-0" + ], + "vcpuNum": 0 + }, + { + "_defaultOrder": 21, + "_isFastLaunch": true, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.c5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 22, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.c5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 23, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.c5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 24, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + 
"memoryGiB": 32, + "name": "ml.c5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 25, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 72, + "name": "ml.c5.9xlarge", + "vcpuNum": 36 + }, + { + "_defaultOrder": 26, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 96, + "name": "ml.c5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 27, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 144, + "name": "ml.c5.18xlarge", + "vcpuNum": 72 + }, + { + "_defaultOrder": 28, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.c5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 29, + "_isFastLaunch": true, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g4dn.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 30, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g4dn.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 31, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g4dn.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 32, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g4dn.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 33, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g4dn.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 34, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g4dn.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 35, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 61, + "name": "ml.p3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 36, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 244, + "name": "ml.p3.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 37, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 488, + "name": "ml.p3.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 38, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.p3dn.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 39, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.r5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 40, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.r5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 41, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.r5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 42, + 
"_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.r5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 43, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.r5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 44, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.r5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 45, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 512, + "name": "ml.r5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 46, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.r5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 47, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 48, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 49, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 50, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 51, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 52, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 53, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.g5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 54, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.g5.48xlarge", + "vcpuNum": 192 + }, + { + "_defaultOrder": 55, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 56, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4de.24xlarge", + "vcpuNum": 96 + } + ], + "instance_type": "ml.t3.medium", + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.8" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/06_CodeGeneration/01_sql_query_generate_w_bedrock.ipynb 
b/06_CodeGeneration/01_sql_query_generate_w_bedrock.ipynb new file mode 100644 index 00000000..e3961740 --- /dev/null +++ b/06_CodeGeneration/01_sql_query_generate_w_bedrock.ipynb @@ -0,0 +1,1025 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "dc40c48b-0c95-4757-a067-563cfccd51a5", + "metadata": { + "tags": [] + }, + "source": [ + "# Invoke Bedrock model for SQL Query Generation\n", + "\n", + "> *This notebook should work well with the **`Data Science 3.0`** kernel in SageMaker Studio*" + ] + }, + { + "cell_type": "markdown", + "id": "c9a413e2-3c34-4073-9000-d8556537bb6a", + "metadata": {}, + "source": [ + "## Introduction\n", + "\n", + "In this notebook we show you how to use an LLM to generate SQL queries to analyze sales data.\n", + "\n", + "We will use Bedrock's Claude v2 model via the Boto3 API.\n", + "\n", + "The prompt used in this example is called a zero-shot prompt because we are not providing any examples other than the prompt itself.\n", + "\n", + "**Note:** *This notebook can be run within or outside of the AWS environment.*\n", + "\n", + "#### Context\n", + "To demonstrate the SQL code generation capability of Amazon Bedrock, we will explore the use of the Boto3 client to communicate with the Amazon Bedrock API. We will demonstrate the different configurations available as well as how a simple input can lead to the desired output.\n", + "\n", + "#### Pattern\n", + "We will simply provide the Amazon Bedrock API with an input consisting of a task, an instruction and an input for the model under the hood to generate an output, without providing any additional examples. The purpose here is to demonstrate how powerful LLMs easily understand the task at hand and generate compelling outputs.\n", + "\n", + "![](./images/bedrock-code-gen.png)\n", + "\n", + "#### Use case\n", + "Let's take the use case of generating SQL queries to analyze sales data, focusing on trends, top products and average sales.\n", + "\n", + "#### Persona\n", + "Maya is a business analyst at AnyCompany, primarily focusing on sales and inventory data. She is transitioning from spreadsheet analysis to data-driven analysis and wants to use SQL to fetch specific data points effectively. She wants to use LLMs to generate SQL queries for her analysis.\n", + "\n", + "#### Implementation\n", + "To fulfill this use case, in this notebook we will show how to generate SQL queries. We will use the Anthropic Claude v2 model via the Amazon Bedrock API with the Boto3 client. " + ] + }, + { + "cell_type": "markdown", + "id": "64baae27-2660-4a1e-b2e5-3de49d069362", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "Before running the rest of this notebook, you'll need to run the cells below to (ensure necessary libraries are installed and) connect to Bedrock.\n", + "\n", + "For more details on how the setup works and ⚠️ **whether you might need to make any changes**, refer to the [Bedrock boto3 setup](../00_Intro/bedrock_boto3_setup.ipynb) notebook.",
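+ "\n", + "Once the Bedrock client has been created in the cells below, you can optionally sanity-check the connection by listing the foundation models visible to your account. This is a quick sketch that assumes the client returned by `get_bedrock_client` exposes `list_foundation_models()` with a `modelSummaries` list in its response; the exact call and response shape may differ depending on your SDK build:\n", + "\n", + "```python\n", + "# Optional connectivity check (assumes list_foundation_models() is available on this client)\n", + "for summary in boto3_bedrock.list_foundation_models()[\"modelSummaries\"]:\n", + "    print(summary[\"modelId\"])\n", + "```"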
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "38b791ad-e6c5-4da5-96af-5c356a36e19d", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Make sure you ran `download-dependencies.sh` from the root of the repository first!\n", + "%pip install --no-build-isolation --force-reinstall \\\n", + " ../dependencies/awscli-*-py3-none-any.whl \\\n", + " ../dependencies/boto3-*-py3-none-any.whl \\\n", + " ../dependencies/botocore-*-py3-none-any.whl\n", + "\n", + "%pip install --quiet langchain==0.0.249" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "776fd083", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "import json\n", + "import os\n", + "import sys\n", + "\n", + "import boto3\n", + "\n", + "module_path = \"..\"\n", + "sys.path.append(os.path.abspath(module_path))\n", + "from utils import bedrock, print_ww\n", + "\n", + "\n", + "# ---- ⚠️ Un-comment and edit the below lines as needed for your AWS setup ⚠️ ----\n", + "\n", + "os.environ[\"AWS_DEFAULT_REGION\"] = \"us-east-1\" # E.g. \"us-east-1\"\n", + "os.environ[\"AWS_PROFILE\"] = \"fine-tuning-bedrock\"\n", + "# os.environ[\"BEDROCK_ASSUME_ROLE\"] = \"\" # E.g. \"arn:aws:...\"\n", + "# os.environ[\"BEDROCK_ENDPOINT_URL\"] = \"\" # E.g. \"https://...\"\n", + "\n", + "\n", + "boto3_bedrock = bedrock.get_bedrock_client(\n", + " assumed_role=os.environ.get(\"BEDROCK_ASSUME_ROLE\", None),\n", + " endpoint_url=os.environ.get(\"BEDROCK_ENDPOINT_URL\", None),\n", + " region=os.environ.get(\"AWS_DEFAULT_REGION\", None),\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "4f634211-3de1-4390-8c3f-367af5554c39", + "metadata": {}, + "source": [ + "## Generate SQL Query\n", + "\n", + "Following on from the use case explained above, let's prepare an input for the Amazon Bedrock service to generate a SQL query." + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "id": "45ee2bae-6415-4dba-af98-a19028305c98", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# create the prompt to generate SQL query\n", + "prompt_data = \"\"\"\n", + "Command: Human: AnyCompany has a database with a table named sales_data containing sales records. The table has following columns:\n", + "- date (YYYY-MM-DD)\n", + "- product_id\n", + "- price\n", + "- units_sold\n", + "\n", + "Can you generate SQL queries for below: \n", + "- Identify the top 5 best selling products by total sales for the year 2023\n", + "- Calculate the monthly average sales for the year 2023\n", + "\n", + "Assistant:\n", + "\"\"\"\n" + ] + }, + { + "cell_type": "markdown", + "id": "cc9784e5-5e9d-472d-8ef1-34108ee4968b", + "metadata": {}, + "source": [ + "Let's start by using the Anthropic Claude v2 model.",
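+ "\n", + "A little further down, Claude returns the SQL as plain text. If you want to actually run those queries without a real database, an in-memory SQLite table with a couple of made-up rows is enough for a sanity check; the sketch below assumes that approach (note that SQLite has no MySQL-style `DATE_FORMAT`, so the monthly query would need `strftime('%Y-%m', date)` instead):\n", + "\n", + "```python\n", + "import sqlite3\n", + "\n", + "# Tiny in-memory sales_data table for sanity-checking generated SQL (sample rows are made up)\n", + "conn = sqlite3.connect(':memory:')\n", + "conn.execute('CREATE TABLE sales_data (date TEXT, product_id TEXT, price REAL, units_sold INTEGER)')\n", + "conn.executemany(\n", + "    'INSERT INTO sales_data VALUES (?, ?, ?, ?)',\n", + "    [('2023-01-15', 'P001', 9.99, 10), ('2023-02-10', 'P002', 4.5, 3)],\n", + ")\n", + "\n", + "query = '''\n", + "SELECT product_id, SUM(price * units_sold) AS total_sales\n", + "FROM sales_data\n", + "WHERE date BETWEEN '2023-01-01' AND '2023-12-31'\n", + "GROUP BY product_id\n", + "ORDER BY total_sales DESC\n", + "LIMIT 5;\n", + "'''\n", + "print(conn.execute(query).fetchall())\n", + "```"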
" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "id": "8af670eb-ad02-40df-a19c-3ed835fac8d9", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Claude - Body Syntex\n", + "body = json.dumps({\n", + " \"prompt\": prompt_data,\n", + " \"max_tokens_to_sample\":4096,\n", + " \"temperature\":0.5,\n", + " \"top_k\":250,\n", + " \"top_p\":0.5,\n", + " \"stop_sequences\": [\"\\n\\nHuman:\"]\n", + " }) " + ] + }, + { + "cell_type": "markdown", + "id": "c4ca6751", + "metadata": {}, + "source": [ + "The Amazon Bedrock API provides you with an API `invoke_model` which accepts the following:\n", + "- `modelId`: This is the model ARN for the various foundation models available under Amazon Bedrock\n", + "- `accept`: The type of input request\n", + "- `contentType`: The content type of the output\n", + "- `body`: A json string consisting of the prompt and the configurations\n", + "\n", + "Available text generation models under Amazon Bedrock have the following IDs:\n", + "- `amazon.titan-tg1-large`\n", + "- `amazon.titan-e1t-medium`\n", + "- `ai21.j2-grande-instruct`\n", + "- `ai21.j2-jumbo-instruct`\n", + "- `ai21.j2-mid`\n", + "- `ai21.j2-ultra`\n", + "- `anthropic.claude-instant-v1`\n", + "- `anthropic.claude-v1`\n", + "- `anthropic.claude-v2`" + ] + }, + { + "cell_type": "markdown", + "id": "088cf6bf-dd73-4710-a0cc-6c11d220c431", + "metadata": {}, + "source": [ + "#### Invoke the Bedrock's Claude Large Large language model" + ] + }, + { + "cell_type": "markdown", + "id": "379498f2", + "metadata": {}, + "source": [ + "First, we explore how the model generates an output based on the prompt created earlier.\n", + "\n", + "##### Complete Output Generation" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "id": "ecaceef1-0f7f-4ae5-8007-ff7c25335251", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Here are the SQL queries to answer the questions:\n", + "\n", + "1. Identify the top 5 best selling products by total sales for the year 2023:\n", + "\n", + "```sql\n", + "SELECT product_id, SUM(price * units_sold) AS total_sales\n", + "FROM sales_data\n", + "WHERE date BETWEEN '2023-01-01' AND '2023-12-31'\n", + "GROUP BY product_id\n", + "ORDER BY total_sales DESC\n", + "LIMIT 5;\n", + "```\n", + "\n", + "2. 
Calculate the monthly average sales for the year 2023:\n", + "\n", + "```sql\n", + "SELECT\n", + " DATE_FORMAT(date, '%Y-%m') AS month,\n", + " AVG(price * units_sold) AS avg_monthly_sales\n", + "FROM sales_data\n", + "WHERE date BETWEEN '2023-01-01' AND '2023-12-31'\n", + "GROUP BY month\n", + "ORDER BY month;\n", + "```\n", + "\n", + "The first query groups the sales data by product_id, sums the total sales for each product, filters\n", + "for 2023 data only, orders by the total sales in descending order and limits to the top 5 results.\n", + "\n", + "The second query extracts the month from the date, calculates the average monthly sales by\n", + "aggregating on the month and ordering the results chronologically.\n" + ] + } + ], + "source": [ + "modelId = 'anthropic.claude-v2' # change this to use a different version from the model provider\n", + "accept = 'application/json'\n", + "contentType = 'application/json'\n", + "\n", + "response = boto3_bedrock.invoke_model(body=body, modelId=modelId, accept=accept, contentType=contentType)\n", + "response_body = json.loads(response.get('body').read())\n", + "\n", + "print_ww(response_body.get('completion'))" + ] + }, + { + "cell_type": "markdown", + "id": "078b9db4", + "metadata": {}, + "source": [ + "### Advanced Example\n", + "#### Understanding a Hospital's Patient Management System through SQL" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "id": "d439b90c", + "metadata": {}, + "outputs": [], + "source": [ + "# create the prompt\n", + "prompt_sql_data = \"\"\"Command: You're provided with a database schema representing a hospital's patient management system.\n", + "The system holds records about patients, their prescriptions, doctors, and the medications prescribed.\n", + "\n", + "Here's the schema:\n", + "\n", + "```sql\n", + "CREATE TABLE Patients (\n", + " PatientID int,\n", + " FirstName varchar(50),\n", + " LastName varchar(50),\n", + " DateOfBirth datetime,\n", + " Gender varchar(10),\n", + " PRIMARY KEY (PatientID)\n", + ");\n", + "\n", + "CREATE TABLE Doctors (\n", + " DoctorID int,\n", + " FirstName varchar(50),\n", + " LastName varchar(50),\n", + " Specialization varchar(50),\n", + " PRIMARY KEY (DoctorID)\n", + ");\n", + "\n", + "CREATE TABLE Prescriptions (\n", + " PrescriptionID int,\n", + " PatientID int,\n", + " DoctorID int,\n", + " DateIssued datetime,\n", + " PRIMARY KEY (PrescriptionID)\n", + ");\n", + "\n", + "CREATE TABLE Medications (\n", + " MedicationID int,\n", + " MedicationName varchar(50),\n", + " Dosage varchar(50),\n", + " PRIMARY KEY (MedicationID)\n", + ");\n", + "\n", + "CREATE TABLE PrescriptionDetails (\n", + " PrescriptionDetailID int,\n", + " PrescriptionID int,\n", + " MedicationID int,\n", + " Quantity int,\n", + " PRIMARY KEY (PrescriptionDetailID)\n", + ");\n", + "```\n", + "\n", + "Write a SQL query that fetches all the patients who were prescribed more than 5 different medications on 2023-04-01.\n", + "\n", + "Assistant:\n", + "\"\"\"\n" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "id": "9afa3431", + "metadata": {}, + "outputs": [], + "source": [ + "# Claude - Body Syntax\n", + "body = json.dumps({\n", + " \"prompt\": prompt_sql_data,\n", + " \"max_tokens_to_sample\":4096,\n", + " \"temperature\":0.5,\n", + " \"top_k\":250,\n", + " \"top_p\":0.5,\n", + " \"stop_sequences\": [\"\\n\\nHuman:\"]\n", + " }) " + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "id": "5c45f4fc", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": 
"stream", + "text": [ + " Here is a SQL query to fetch patients who were prescribed more than 5 medications on 2023-04-01:\n", + "\n", + "```sql\n", + "SELECT p.FirstName, p.LastName\n", + "FROM Patients p\n", + "JOIN Prescriptions pre ON p.PatientID = pre.PatientID\n", + "JOIN PrescriptionDetails pd ON pre.PrescriptionID = pd.PrescriptionID\n", + "WHERE pre.DateIssued = '2023-04-01'\n", + "GROUP BY p.PatientID\n", + "HAVING COUNT(DISTINCT pd.MedicationID) > 5;\n", + "```\n", + "\n", + "The key steps are:\n", + "\n", + "1. Join the Patients, Prescriptions and PrescriptionDetails tables to connect patients with their\n", + "prescriptions and medication details.\n", + "\n", + "2. Filter to only prescriptions issued on 2023-04-01.\n", + "\n", + "3. Group by PatientID and count the distinct MedicationIDs per patient.\n", + "\n", + "4. Use HAVING to only keep patients with more than 5 distinct medications.\n", + "\n", + "This will return all patients who had prescriptions for more than 5 different medications on the\n", + "given date.\n" + ] + } + ], + "source": [ + "modelId = 'anthropic.claude-v2' # change this to use a different version from the model provider\n", + "accept = 'application/json'\n", + "contentType = 'application/json'\n", + "\n", + "response = boto3_bedrock.invoke_model(body=body, modelId=modelId, accept=accept, contentType=contentType)\n", + "response_body = json.loads(response.get('body').read())\n", + "\n", + "print_ww(response_body.get('completion'))" + ] + }, + { + "cell_type": "markdown", + "id": "64b08b3b", + "metadata": {}, + "source": [ + "## Conclusion\n", + "You have now experimented with using the `boto3` SDK, which provides vanilla exposure to the Amazon Bedrock API. Using this API you have seen the use case of generating SQL queries to analyze sales data.\n", + "\n", + "### Takeaways\n", + "- Adapt this notebook to experiment with different models available through Amazon Bedrock such as Amazon Titan and AI21 Labs Jurassic models.\n", + "- Change the prompts to your specific use case and evaluate the output of different models.\n", + "- Play with the token length to understand the latency and responsiveness of the service.\n", + "- Apply different prompt engineering principles to get better outputs.\n", + "\n", + "## Thank You" + ] + } + ], + "metadata": { + "availableInstances": [ + { + "_defaultOrder": 0, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.t3.medium", + "vcpuNum": 2 + }, + { + "_defaultOrder": 1, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.t3.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 2, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.t3.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 3, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.t3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 4, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 5, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 6, + "_isFastLaunch": 
false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 7, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 8, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 9, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 10, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 11, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 12, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5d.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 13, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5d.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 14, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5d.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 15, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5d.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 16, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5d.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 17, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5d.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 18, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5d.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 19, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 20, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": true, + "memoryGiB": 0, + "name": "ml.geospatial.interactive", + "supportedImageNames": [ + "sagemaker-geospatial-v1-0" + ], + "vcpuNum": 0 + }, + { + "_defaultOrder": 21, + "_isFastLaunch": true, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.c5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 22, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.c5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 23, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + 
"name": "ml.c5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 24, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.c5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 25, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 72, + "name": "ml.c5.9xlarge", + "vcpuNum": 36 + }, + { + "_defaultOrder": 26, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 96, + "name": "ml.c5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 27, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 144, + "name": "ml.c5.18xlarge", + "vcpuNum": 72 + }, + { + "_defaultOrder": 28, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.c5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 29, + "_isFastLaunch": true, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g4dn.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 30, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g4dn.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 31, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g4dn.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 32, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g4dn.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 33, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g4dn.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 34, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g4dn.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 35, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 61, + "name": "ml.p3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 36, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 244, + "name": "ml.p3.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 37, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 488, + "name": "ml.p3.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 38, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.p3dn.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 39, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.r5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 40, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.r5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 41, + "_isFastLaunch": false, 
+ "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.r5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 42, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.r5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 43, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.r5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 44, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.r5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 45, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 512, + "name": "ml.r5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 46, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.r5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 47, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 48, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 49, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 50, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 51, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 52, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 53, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.g5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 54, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.g5.48xlarge", + "vcpuNum": 192 + }, + { + "_defaultOrder": 55, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 56, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4de.24xlarge", + "vcpuNum": 96 + } + ], + "instance_type": "ml.t3.medium", + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + 
"pygments_lexer": "ipython3", + "version": "3.10.8" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/06_CodeGeneration/02_code_interpret_w_langchain.ipynb b/06_CodeGeneration/02_code_interpret_w_langchain.ipynb new file mode 100644 index 00000000..1e2d55e9 --- /dev/null +++ b/06_CodeGeneration/02_code_interpret_w_langchain.ipynb @@ -0,0 +1,967 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "af3f88dd-0f5e-427e-84ee-8934982300d1", + "metadata": { + "tags": [] + }, + "source": [ + "# Bedrock with LangChain using a Prompt that includes Context\n", + "\n", + "> *This notebook should work well with the **`Data Science 3.0`** kernel in SageMaker Studio*" + ] + }, + { + "cell_type": "markdown", + "id": "b920ca4a-a71d-4630-a6e4-577d95192ad1", + "metadata": {}, + "source": [ + "## Introduction\n", + "\n", + "In this notebook we show you how to explain or interpret a given code snippet or program.\n", + "\n", + "[LangChain](https://python.langchain.com/docs/get_started/introduction.html) is a framework for developing applications powered by language models. The key aspects of this framework allow us to augment the Large Language Models by chaining together various components to create advanced use cases.\n", + "\n", + "In this notebook we will use the Bedrock API provided by LangChain. The prompt used in this example creates a custom LangChain prompt template for adding context to the code explain request. \n", + "\n", + "**Note:** *This notebook can be run within or outside of AWS environment.*\n", + "\n", + "#### Context\n", + "In the previous example `01_sql_query_generation_w_bedrock.ipynb`, we explored how to use Bedrock API. LangChain framework to communicate with Amazon Bedrock API. In this notebook we will try to add a bit more complexity with the help of `PromptTemplates` to leverage the LangChain framework for the similar use case. `PrompTemplates` allow you to create generic shells which can be populated with information later and get model outputs based on different scenarios.\n", + "\n", + "As part of this notebook we will explore the use of Amazon Bedrock integration within LangChain framework and how it could be used to generate text with the help of `PromptTemplate`.\n", + "\n", + "#### Pattern\n", + "We will simply provide the LangChain implementation of Amazon Bedrock API with an input consisting of a task, an instruction and an input for the model under the hood to generate an output without providing any additional example. The purpose here is to demonstrate how the powerful LLMs easily understand the task at hand and generate compelling outputs.\n", + "\n", + "![](./images/bedrock-code-gen-langchain.png)\n", + "\n", + "#### Use case\n", + "To demonstrate the generation capability of models in Amazon Bedrock, let's take the use case of code explain.\n", + "\n", + "#### Persona\n", + "You are Joe, a Java software developer, has been tasked to support a legacy C++ application for Vehicle Fleet Management. 
You need help to explain or interpret certain complex C++ code snippets as you are performing analyis to identify the business logic and potential problems with the code.\n", + "\n", + "#### Implementation\n", + "To fulfill this use case, we will show you how you can Amazon Bedrock API with LangChain to explain C++ code snippets.\n" + ] + }, + { + "cell_type": "markdown", + "id": "aa11828a-243d-4808-9c92-e8caf4cebd37", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "Before running the rest of this notebook, you'll need to run the cells below to (ensure necessary libraries are installed and) connect to Bedrock.\n", + "\n", + "For more details on how the setup works and ⚠️ **whether you might need to make any changes**, refer to the [Bedrock boto3 setup notebook](../00_Intro/bedrock_boto3_setup.ipynb) notebook.\n", + "\n", + "In this notebook, we'll also install the [Hugging Face Transformers](https://huggingface.co/docs/transformers/index) library which we'll use for counting the number of tokens in an input prompt." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "49e2c0a9-4838-4f2b-bb36-61c0cbcd62af", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Make sure you ran `download-dependencies.sh` from the root of the repository first!\n", + "%pip install --no-build-isolation --force-reinstall \\\n", + " ../dependencies/awscli-*-py3-none-any.whl \\\n", + " ../dependencies/boto3-*-py3-none-any.whl \\\n", + " ../dependencies/botocore-*-py3-none-any.whl\n", + "\n", + "%pip install --quiet langchain==0.0.249 \"transformers>=4.24,<5\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "558a9372-0789-414a-a1d7-2976056f2015", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "import json\n", + "import os\n", + "import sys\n", + "\n", + "import boto3\n", + "\n", + "module_path = \"..\"\n", + "sys.path.append(os.path.abspath(module_path))\n", + "from utils import bedrock, print_ww\n", + "\n", + "\n", + "# ---- ⚠️ Un-comment and edit the below lines as needed for your AWS setup ⚠️ ----\n", + "\n", + "os.environ[\"AWS_DEFAULT_REGION\"] = \"us-east-1\" # E.g. \"us-east-1\"\n", + "os.environ[\"AWS_PROFILE\"] = \"fine-tuning-bedrock\"\n", + "# os.environ[\"BEDROCK_ASSUME_ROLE\"] = \"\" # E.g. \"arn:aws:...\"\n", + "# os.environ[\"BEDROCK_ENDPOINT_URL\"] = \"\" # E.g. \"https://...\"\n", + "\n", + "\n", + "boto3_bedrock = bedrock.get_bedrock_client(\n", + " assumed_role=os.environ.get(\"BEDROCK_ASSUME_ROLE\", None),\n", + " endpoint_url=os.environ.get(\"BEDROCK_ENDPOINT_URL\", None),\n", + " region=os.environ.get(\"AWS_DEFAULT_REGION\", None),\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "b7daa1a8-d21a-410c-adbf-b253c2dabf80", + "metadata": { + "tags": [] + }, + "source": [ + "## Invoke the Bedrock LLM Model\n", + "\n", + "We'll begin with creating an instance of Bedrock class from llms. This expects a `model_id` which is the ARN of the model available in Amazon Bedrock. 
\n", + "\n", + "Optionally you can pass on a previously created boto3 client as well as some `model_kwargs` which can hold parameters such as `temperature`, `topP`, `maxTokenCount` or `stopSequences` (more on parameters can be explored in Amazon Bedrock console).\n", + "\n", + "Available text generation models under Amazon Bedrock have the following IDs:\n", + "\n", + "- amazon.titan-tg1-large\n", + "- ai21.j2-grande-instruct\n", + "- ai21.j2-jumbo-instruct\n", + "- anthropic.claude-instant-v1\n", + "- anthropic.claude-v1\n", + "\n", + "Note that different models support different `model_kwargs`." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "8ffa1250-56cd-4b6d-b3d8-c62baac143ce", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "from langchain.llms.bedrock import Bedrock\n", + "\n", + "inference_modifier = {'max_tokens_to_sample':4096, \n", + " \"temperature\":0.5,\n", + " \"top_k\":250,\n", + " \"top_p\":1,\n", + " \"stop_sequences\": [\"\\n\\nHuman\"]\n", + " }\n", + "\n", + "textgen_llm = Bedrock(model_id = \"anthropic.claude-v2\",\n", + " client = boto3_bedrock, \n", + " model_kwargs = inference_modifier \n", + " )\n" + ] + }, + { + "cell_type": "markdown", + "id": "de2678ed-f0d6-444f-9a57-5170dd1952f7", + "metadata": {}, + "source": [ + "## Create a LangChain custom prompt template\n", + "\n", + "By creating a template for the prompt we can pass it different input variables to it on every run. This is useful when you have to generate content with different input variables that you may be fetching from a database." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "96bc21b9", + "metadata": {}, + "outputs": [], + "source": [ + "# Vehicle Fleet Management Code written in C++\n", + "sample_code = \"\"\"\n", + "#include \n", + "#include \n", + "#include \n", + "\n", + "class Vehicle {\n", + "protected:\n", + " std::string registrationNumber;\n", + " int milesTraveled;\n", + " int lastMaintenanceMile;\n", + "\n", + "public:\n", + " Vehicle(std::string regNum) : registrationNumber(regNum), milesTraveled(0), lastMaintenanceMile(0) {}\n", + "\n", + " virtual void addMiles(int miles) {\n", + " milesTraveled += miles;\n", + " }\n", + "\n", + " virtual void performMaintenance() {\n", + " lastMaintenanceMile = milesTraveled;\n", + " std::cout << \"Maintenance performed for vehicle: \" << registrationNumber << std::endl;\n", + " }\n", + "\n", + " virtual void checkMaintenanceDue() {\n", + " if ((milesTraveled - lastMaintenanceMile) > 10000) {\n", + " std::cout << \"Vehicle: \" << registrationNumber << \" needs maintenance!\" << std::endl;\n", + " } else {\n", + " std::cout << \"No maintenance required for vehicle: \" << registrationNumber << std::endl;\n", + " }\n", + " }\n", + "\n", + " virtual void displayDetails() = 0;\n", + "\n", + " ~Vehicle() {\n", + " std::cout << \"Destructor for Vehicle\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "class Truck : public Vehicle {\n", + " int capacityInTons;\n", + "\n", + "public:\n", + " Truck(std::string regNum, int capacity) : Vehicle(regNum), capacityInTons(capacity) {}\n", + "\n", + " void displayDetails() override {\n", + " std::cout << \"Truck with Registration Number: \" << registrationNumber << \", Capacity: \" << capacityInTons << \" tons.\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "class Car : public Vehicle {\n", + " std::string model;\n", + "\n", + "public:\n", + " Car(std::string regNum, std::string carModel) : Vehicle(regNum), model(carModel) {}\n", + "\n", + " void 
displayDetails() override {\n", + " std::cout << \"Car with Registration Number: \" << registrationNumber << \", Model: \" << model << \".\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "int main() {\n", + " std::vector fleet;\n", + "\n", + " fleet.push_back(new Truck(\"XYZ1234\", 20));\n", + " fleet.push_back(new Car(\"ABC9876\", \"Sedan\"));\n", + "\n", + " for (auto vehicle : fleet) {\n", + " vehicle->displayDetails();\n", + " vehicle->addMiles(10500);\n", + " vehicle->checkMaintenanceDue();\n", + " vehicle->performMaintenance();\n", + " vehicle->checkMaintenanceDue();\n", + " }\n", + "\n", + " for (auto vehicle : fleet) {\n", + " delete vehicle; \n", + " }\n", + "\n", + " return 0;\n", + "}\n", + "\"\"\"" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "dbec103a-97ae-4e9e-9d80-dc20f354a228", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "from langchain import PromptTemplate\n", + "\n", + "# Create a prompt template that has multiple input variables\n", + "multi_var_prompt = PromptTemplate(\n", + " input_variables=[\"code\", \"programmingLanguage\"], \n", + " template=\"\"\"Human: You will be acting as an expert software developer in {programmingLanguage}. \n", + " You will explain below code and highlight if any red flags or not following best practices.\n", + " {code}\n", + " Assistant: \n", + " \"\"\"\n", + ")\n", + "\n", + "# Pass in values to the input variables\n", + "prompt = multi_var_prompt.format(code=sample_code, programmingLanguage=\"C++\")\n" + ] + }, + { + "cell_type": "markdown", + "id": "a5b76387", + "metadata": {}, + "source": [ + "### Explain C++ Code for Vehicle Fleet management using Amazon Bedrock and LangChain" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "c1064c57-27a4-48c5-911b-e4f1dfeff122", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Overall, the code follows good OOP design principles and uses inheritance appropriately. The Vehicle\n", + "base class contains common data members and methods, while Truck and Car derive from it and add\n", + "specific details.\n", + "\n", + "Some positives:\n", + "\n", + "- Uses protected inheritance correctly to allow derived classes access to base class members.\n", + "\n", + "- Uses virtual methods like displayDetails() to enable polymorphic behavior.\n", + "\n", + "- Uses smart pointers (unique_ptr) instead of raw pointers to manage memory and avoid leaks.\n", + "\n", + "- Uses override specifier to explicitly indicate overridden methods.\n", + "\n", + "- Uses a vector to store heterogeneous objects through a common base pointer.\n", + "\n", + "- Checks for maintenance due based on miles traveled.\n", + "\n", + "- No major red flags or bad practices noted.\n", + "\n", + "Some things that could be improved:\n", + "\n", + "- The base Vehicle class could use pure virtual methods instead of a mix of virtual and pure virtual\n", + "methods.\n", + "\n", + "- The Vehicle constructor initializes data members - should consider using member initializer list\n", + "instead.\n", + "\n", + "- Unique pointers could be used instead of raw pointers for automatic memory management.\n", + "\n", + "- The displayDetails() method could be renamed to something more specific like printDetails().\n", + "\n", + "- Comments could be added to explain parts of logic/flow.\n", + "\n", + "Overall the code is well written, follows OOP principles and does not have any major issues. 
Just a\n", + "few minor improvements/enhancements possible.\n" + ] + } + ], + "source": [ + "response = textgen_llm(prompt)\n", + "\n", + "code_explanation = response[response.index('\\n')+1:]\n", + "\n", + "print_ww(code_explanation)" + ] + }, + { + "cell_type": "markdown", + "id": "9e9abc40", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "To conclude we learnt that invoking the LLM without any context might not yield the desired results. By adding context and further using the the prompt template to constrain the output from the LLM we are able to successfully get our desired output" + ] + } + ], + "metadata": { + "availableInstances": [ + { + "_defaultOrder": 0, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.t3.medium", + "vcpuNum": 2 + }, + { + "_defaultOrder": 1, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.t3.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 2, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.t3.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 3, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.t3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 4, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 5, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 6, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 7, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 8, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 9, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 10, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 11, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 12, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5d.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 13, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5d.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 14, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 
32, + "name": "ml.m5d.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 15, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5d.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 16, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5d.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 17, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5d.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 18, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5d.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 19, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 20, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": true, + "memoryGiB": 0, + "name": "ml.geospatial.interactive", + "supportedImageNames": [ + "sagemaker-geospatial-v1-0" + ], + "vcpuNum": 0 + }, + { + "_defaultOrder": 21, + "_isFastLaunch": true, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.c5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 22, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.c5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 23, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.c5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 24, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.c5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 25, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 72, + "name": "ml.c5.9xlarge", + "vcpuNum": 36 + }, + { + "_defaultOrder": 26, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 96, + "name": "ml.c5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 27, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 144, + "name": "ml.c5.18xlarge", + "vcpuNum": 72 + }, + { + "_defaultOrder": 28, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.c5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 29, + "_isFastLaunch": true, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g4dn.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 30, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g4dn.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 31, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g4dn.4xlarge", + "vcpuNum": 16 + }, + { + 
"_defaultOrder": 32, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g4dn.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 33, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g4dn.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 34, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g4dn.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 35, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 61, + "name": "ml.p3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 36, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 244, + "name": "ml.p3.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 37, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 488, + "name": "ml.p3.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 38, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.p3dn.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 39, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.r5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 40, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.r5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 41, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.r5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 42, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.r5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 43, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.r5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 44, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.r5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 45, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 512, + "name": "ml.r5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 46, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.r5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 47, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 48, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 49, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + 
"hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 50, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 51, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 52, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 53, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.g5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 54, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.g5.48xlarge", + "vcpuNum": 192 + }, + { + "_defaultOrder": 55, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 56, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4de.24xlarge", + "vcpuNum": 96 + } + ], + "instance_type": "ml.t3.medium", + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.8" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/06_CodeGeneration/03_code_translate_w_langchain.ipynb b/06_CodeGeneration/03_code_translate_w_langchain.ipynb new file mode 100644 index 00000000..8162b9e6 --- /dev/null +++ b/06_CodeGeneration/03_code_translate_w_langchain.ipynb @@ -0,0 +1,1132 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "af3f88dd-0f5e-427e-84ee-8934982300d1", + "metadata": { + "tags": [] + }, + "source": [ + "# Bedrock with LangChain - Code Translation from one programming language to another\n", + "\n", + "> *This notebook should work well with the **`Data Science 3.0`** kernel in SageMaker Studio*" + ] + }, + { + "cell_type": "markdown", + "id": "b920ca4a-a71d-4630-a6e4-577d95192ad1", + "metadata": {}, + "source": [ + "## Introduction\n", + "\n", + "In this notebook we show you how to generate an email response to a customer who was not happy with the quality of customer service that they received from the customer support engineer. We will provide additional context to the model by providing the contents of the actual email that was received from the unhappy customer.\n", + "\n", + "Because of additional context in the prompt, the text produced by the Amazon Titan Large language model in this notebook is of much better quality and relevance than the content produced earlier through zero-shot prompts.\n", + "\n", + "[LangChain](https://python.langchain.com/docs/get_started/introduction.html) is a framework for developing applications powered by language models. 
The key aspects of this framework allow us to augment the Large Language Models by chaining together various components to create advanced use cases.\n", + "\n", + "In this notebook we will use the Bedrock API provided by LangChain. The prompt used in this example creates a custom LangChain prompt template for adding context to the code translation request. \n", + "\n", + "**Note:** *This notebook can be run within or outside of the AWS environment.*\n", + "\n", + "#### Context\n", + "In the previous example, `02_code_interpret_w_langchain.ipynb`, we explored how to use the LangChain framework to communicate with the Amazon Bedrock API. In this notebook we will try to add a bit more complexity with the help of `PromptTemplates` to leverage the LangChain framework for a similar use case. `PromptTemplates` allow you to create generic shells which can be populated with information later and get model outputs based on different scenarios.\n", + "\n", + "As part of this notebook we will explore the use of the Amazon Bedrock integration within the LangChain framework and how it could be used to generate text with the help of `PromptTemplate`.\n", + "\n", + "#### Pattern\n", + "We will simply provide the LangChain implementation of Amazon Bedrock API with an input consisting of a task, an instruction and an input for the model under the hood to generate an output without providing any additional example. The purpose here is to demonstrate how the powerful LLMs easily understand the task at hand and generate compelling outputs.\n", + "\n", + "![](./images/bedrock-code-gen-langchain.png)\n", + "\n", + "#### Use case\n", + "To demonstrate the generation capability of models in Amazon Bedrock, let's take the use case of code translation.\n", + "\n", + "#### Persona\n", + "You are Joe, a Java software developer at AnyCompany who has been asked to migrate a legacy C++ application for Vehicle Fleet Management to Java. You need the help of an LLM to translate the existing C++ code into idiomatic Java so that the application can be modernized while preserving its business logic.\n", + "\n", + "#### Implementation\n", + "To fulfill this use case, we will show you how to translate a C++ code sample into Java. We will use the Anthropic Claude v2 model through the Amazon Bedrock LangChain integration. \n" + ] + }, + { + "cell_type": "markdown", + "id": "aa11828a-243d-4808-9c92-e8caf4cebd37", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "Before running the rest of this notebook, you'll need to run the cells below to (ensure necessary libraries are installed and) connect to Bedrock.\n", + "\n", + "For more details on how the setup works and ⚠️ **whether you might need to make any changes**, refer to the [Bedrock boto3 setup notebook](../00_Intro/bedrock_boto3_setup.ipynb) notebook.\n", + "\n", + "In this notebook, we'll also install the [Hugging Face Transformers](https://huggingface.co/docs/transformers/index) library which we'll use for counting the number of tokens in an input prompt." 
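, + "\n", + "As a rough illustration of how that token counting could look, the snippet below is a minimal sketch only; it uses the GPT-2 tokenizer as a stand-in, since the exact tokenizer used by the Bedrock models is not exposed through `transformers`:\n", + "\n", + "```python\n", + "from transformers import AutoTokenizer\n", + "\n", + "# Load a general-purpose tokenizer to approximate the prompt's token count\n", + "tokenizer = AutoTokenizer.from_pretrained(\"gpt2\")\n", + "print(len(tokenizer.encode(\"Translate this C++ snippet to Java\")))\n", + "```"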
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "49e2c0a9-4838-4f2b-bb36-61c0cbcd62af", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Processing /Users/mundabra/dev/bedrock/amazon-bedrock-workshop/dependencies/awscli-1.29.21-py3-none-any.whl\n", + "Processing /Users/mundabra/dev/bedrock/amazon-bedrock-workshop/dependencies/boto3-1.28.21-py3-none-any.whl\n", + "Processing /Users/mundabra/dev/bedrock/amazon-bedrock-workshop/dependencies/botocore-1.31.21-py3-none-any.whl\n", + "Collecting docutils<0.17,>=0.10\n", + " Using cached docutils-0.16-py2.py3-none-any.whl (548 kB)\n", + "Collecting s3transfer<0.7.0,>=0.6.0\n", + " Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)\n", + "Collecting PyYAML<6.1,>=3.10\n", + " Using cached PyYAML-6.0.1-cp310-cp310-macosx_10_9_x86_64.whl (189 kB)\n", + "Collecting rsa<4.8,>=3.1.2\n", + " Using cached rsa-4.7.2-py3-none-any.whl (34 kB)\n", + "Collecting colorama<0.4.5,>=0.2.5\n", + " Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)\n", + "Collecting urllib3<1.27,>=1.25.4\n", + " Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)\n", + "Collecting jmespath<2.0.0,>=0.7.1\n", + " Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)\n", + "Collecting python-dateutil<3.0.0,>=2.1\n", + " Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)\n", + "Collecting six>=1.5\n", + " Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)\n", + "Collecting pyasn1>=0.1.3\n", + " Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)\n", + "Installing collected packages: urllib3, six, PyYAML, pyasn1, jmespath, docutils, colorama, rsa, python-dateutil, botocore, s3transfer, boto3, awscli\n", + " Attempting uninstall: urllib3\n", + " Found existing installation: urllib3 1.26.16\n", + " Uninstalling urllib3-1.26.16:\n", + " Successfully uninstalled urllib3-1.26.16\n", + " Attempting uninstall: six\n", + " Found existing installation: six 1.16.0\n", + " Uninstalling six-1.16.0:\n", + " Successfully uninstalled six-1.16.0\n", + " Attempting uninstall: PyYAML\n", + " Found existing installation: PyYAML 6.0.1\n", + " Uninstalling PyYAML-6.0.1:\n", + " Successfully uninstalled PyYAML-6.0.1\n", + " Attempting uninstall: pyasn1\n", + " Found existing installation: pyasn1 0.5.0\n", + " Uninstalling pyasn1-0.5.0:\n", + " Successfully uninstalled pyasn1-0.5.0\n", + " Attempting uninstall: jmespath\n", + " Found existing installation: jmespath 1.0.1\n", + " Uninstalling jmespath-1.0.1:\n", + " Successfully uninstalled jmespath-1.0.1\n", + " Attempting uninstall: docutils\n", + " Found existing installation: docutils 0.16\n", + " Uninstalling docutils-0.16:\n", + " Successfully uninstalled docutils-0.16\n", + " Attempting uninstall: colorama\n", + " Found existing installation: colorama 0.4.4\n", + " Uninstalling colorama-0.4.4:\n", + " Successfully uninstalled colorama-0.4.4\n", + " Attempting uninstall: rsa\n", + " Found existing installation: rsa 4.7.2\n", + " Uninstalling rsa-4.7.2:\n", + " Successfully uninstalled rsa-4.7.2\n", + " Attempting uninstall: python-dateutil\n", + " Found existing installation: python-dateutil 2.8.2\n", + " Uninstalling python-dateutil-2.8.2:\n", + " Successfully uninstalled python-dateutil-2.8.2\n", + " Attempting uninstall: botocore\n", + " Found existing installation: botocore 1.31.21\n", + " Uninstalling botocore-1.31.21:\n", + " Successfully uninstalled botocore-1.31.21\n", + " Attempting uninstall: s3transfer\n", + " Found existing 
installation: s3transfer 0.6.1\n", + " Uninstalling s3transfer-0.6.1:\n", + " Successfully uninstalled s3transfer-0.6.1\n", + " Attempting uninstall: boto3\n", + " Found existing installation: boto3 1.28.21\n", + " Uninstalling boto3-1.28.21:\n", + " Successfully uninstalled boto3-1.28.21\n", + " Attempting uninstall: awscli\n", + " Found existing installation: awscli 1.29.21\n", + " Uninstalling awscli-1.29.21:\n", + " Successfully uninstalled awscli-1.29.21\n", + "Successfully installed PyYAML-6.0.1 awscli-1.29.21 boto3-1.28.21 botocore-1.31.21 colorama-0.4.4 docutils-0.16 jmespath-1.0.1 pyasn1-0.5.0 python-dateutil-2.8.2 rsa-4.7.2 s3transfer-0.6.1 six-1.16.0 urllib3-1.26.16\n", + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip available: \u001b[0m\u001b[31;49m22.2.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m23.2.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython3.10 -m pip install --upgrade pip\u001b[0m\n", + "Note: you may need to restart the kernel to use updated packages.\n", + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip available: \u001b[0m\u001b[31;49m22.2.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m23.2.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython3.10 -m pip install --upgrade pip\u001b[0m\n", + "Note: you may need to restart the kernel to use updated packages.\n" + ] + } + ], + "source": [ + "# Make sure you ran `download-dependencies.sh` from the root of the repository first!\n", + "%pip install --no-build-isolation --force-reinstall \\\n", + " ../dependencies/awscli-*-py3-none-any.whl \\\n", + " ../dependencies/boto3-*-py3-none-any.whl \\\n", + " ../dependencies/botocore-*-py3-none-any.whl\n", + "\n", + "%pip install --quiet langchain==0.0.249 \"transformers>=4.24,<5\"" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "558a9372-0789-414a-a1d7-2976056f2015", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Create new client\n", + " Using region: us-east-1\n", + " Using profile: fine-tuning-bedrock\n", + "boto3 Bedrock client successfully created!\n", + "bedrock(https://bedrock.us-east-1.amazonaws.com)\n" + ] + } + ], + "source": [ + "import json\n", + "import os\n", + "import sys\n", + "\n", + "import boto3\n", + "\n", + "module_path = \"..\"\n", + "sys.path.append(os.path.abspath(module_path))\n", + "from utils import bedrock, print_ww\n", + "\n", + "\n", + "# ---- ⚠️ Un-comment and edit the below lines as needed for your AWS setup ⚠️ ----\n", + "\n", + "os.environ[\"AWS_DEFAULT_REGION\"] = \"us-east-1\" # E.g. \"us-east-1\"\n", + "os.environ[\"AWS_PROFILE\"] = \"fine-tuning-bedrock\"\n", + "# os.environ[\"BEDROCK_ASSUME_ROLE\"] = \"\" # E.g. \"arn:aws:...\"\n", + "# os.environ[\"BEDROCK_ENDPOINT_URL\"] = \"\" # E.g. 
\"https://...\"\n", + "\n", + "\n", + "boto3_bedrock = bedrock.get_bedrock_client(\n", + " assumed_role=os.environ.get(\"BEDROCK_ASSUME_ROLE\", None),\n", + " endpoint_url=os.environ.get(\"BEDROCK_ENDPOINT_URL\", None),\n", + " region=os.environ.get(\"AWS_DEFAULT_REGION\", None),\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "b7daa1a8-d21a-410c-adbf-b253c2dabf80", + "metadata": { + "tags": [] + }, + "source": [ + "## Invoke the Bedrock LLM Model\n", + "\n", + "We'll begin with creating an instance of Bedrock class from llms. This expects a `model_id` which is the ARN of the model available in Amazon Bedrock. \n", + "\n", + "Optionally you can pass on a previously created boto3 client as well as some `model_kwargs` which can hold parameters such as `temperature`, `topP`, `maxTokenCount` or `stopSequences` (more on parameters can be explored in Amazon Bedrock console).\n", + "\n", + "Available text generation models under Amazon Bedrock have the following IDs:\n", + "\n", + "- amazon.titan-tg1-large\n", + "- ai21.j2-grande-instruct\n", + "- ai21.j2-jumbo-instruct\n", + "- anthropic.claude-instant-v1\n", + "- anthropic.claude-v1\n", + "\n", + "Note that different models support different `model_kwargs`." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "8ffa1250-56cd-4b6d-b3d8-c62baac143ce", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "from langchain.llms.bedrock import Bedrock\n", + "\n", + "inference_modifier = {'max_tokens_to_sample':4096, \n", + " \"temperature\":0.5,\n", + " \"top_k\":250,\n", + " \"top_p\":1,\n", + " \"stop_sequences\": [\"\\n\\nHuman\"]\n", + " }\n", + "\n", + "textgen_llm = Bedrock(model_id = \"anthropic.claude-v2\",\n", + " client = boto3_bedrock, \n", + " model_kwargs = inference_modifier \n", + " )\n" + ] + }, + { + "cell_type": "markdown", + "id": "de2678ed-f0d6-444f-9a57-5170dd1952f7", + "metadata": {}, + "source": [ + "## Create a LangChain custom prompt template\n", + "\n", + "By creating a template for the prompt we can pass it different input variables to it on every run. This is useful when you have to generate content with different input variables that you may be fetching from a database.\n", + "\n", + "Previously we hardcoded the prompt, it might be the case that you have multiple customers sending similar negative feedback and you now want to use each of those customer's emails and respond to them with an apology but you also want to keep the response a bit personalized. In the following cell we are exploring how you can create a `PromptTemplate` to achieve this pattern." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "96bc21b9", + "metadata": {}, + "outputs": [], + "source": [ + "# Vehicle Fleet Management Code written in C++\n", + "sample_code = \"\"\"\n", + "#include \n", + "#include \n", + "#include \n", + "\n", + "class Vehicle {\n", + "protected:\n", + " std::string registrationNumber;\n", + " int milesTraveled;\n", + " int lastMaintenanceMile;\n", + "\n", + "public:\n", + " Vehicle(std::string regNum) : registrationNumber(regNum), milesTraveled(0), lastMaintenanceMile(0) {}\n", + "\n", + " virtual void addMiles(int miles) {\n", + " milesTraveled += miles;\n", + " }\n", + "\n", + " virtual void performMaintenance() {\n", + " lastMaintenanceMile = milesTraveled;\n", + " std::cout << \"Maintenance performed for vehicle: \" << registrationNumber << std::endl;\n", + " }\n", + "\n", + " virtual void checkMaintenanceDue() {\n", + " if ((milesTraveled - lastMaintenanceMile) > 10000) {\n", + " std::cout << \"Vehicle: \" << registrationNumber << \" needs maintenance!\" << std::endl;\n", + " } else {\n", + " std::cout << \"No maintenance required for vehicle: \" << registrationNumber << std::endl;\n", + " }\n", + " }\n", + "\n", + " virtual void displayDetails() = 0;\n", + "\n", + " ~Vehicle() {\n", + " std::cout << \"Destructor for Vehicle\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "class Truck : public Vehicle {\n", + " int capacityInTons;\n", + "\n", + "public:\n", + " Truck(std::string regNum, int capacity) : Vehicle(regNum), capacityInTons(capacity) {}\n", + "\n", + " void displayDetails() override {\n", + " std::cout << \"Truck with Registration Number: \" << registrationNumber << \", Capacity: \" << capacityInTons << \" tons.\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "class Car : public Vehicle {\n", + " std::string model;\n", + "\n", + "public:\n", + " Car(std::string regNum, std::string carModel) : Vehicle(regNum), model(carModel) {}\n", + "\n", + " void displayDetails() override {\n", + " std::cout << \"Car with Registration Number: \" << registrationNumber << \", Model: \" << model << \".\" << std::endl;\n", + " }\n", + "};\n", + "\n", + "int main() {\n", + " std::vector fleet;\n", + "\n", + " fleet.push_back(new Truck(\"XYZ1234\", 20));\n", + " fleet.push_back(new Car(\"ABC9876\", \"Sedan\"));\n", + "\n", + " for (auto vehicle : fleet) {\n", + " vehicle->displayDetails();\n", + " vehicle->addMiles(10500);\n", + " vehicle->checkMaintenanceDue();\n", + " vehicle->performMaintenance();\n", + " vehicle->checkMaintenanceDue();\n", + " }\n", + "\n", + " for (auto vehicle : fleet) {\n", + " delete vehicle; \n", + " }\n", + "\n", + " return 0;\n", + "}\n", + "\"\"\"" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "dbec103a-97ae-4e9e-9d80-dc20f354a228", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "from langchain import PromptTemplate\n", + "\n", + "# Create a prompt template that has multiple input variables\n", + "multi_var_prompt = PromptTemplate(\n", + " input_variables=[\"code\", \"srcProgrammingLanguage\", \"targetProgrammingLanguage\"], \n", + " template=\"\"\"Human: You will be acting as an expert software developer in {srcProgrammingLanguage} and {targetProgrammingLanguage}. 
\n", + " You will tranlslate below code from {srcProgrammingLanguage} to {targetProgrammingLanguage} while following coding best practices.\n", + " {code}\n", + " Assistant: \n", + " \"\"\"\n", + ")\n", + "\n", + "# Pass in values to the input variables\n", + "prompt = multi_var_prompt.format(code=sample_code, srcProgrammingLanguage=\"C++\", targetProgrammingLanguage=\"Java\")\n" + ] + }, + { + "cell_type": "markdown", + "id": "a5b76387", + "metadata": {}, + "source": [ + "### Code translation from C++ to Java" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "c1064c57-27a4-48c5-911b-e4f1dfeff122", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "```java\n", + "import java.util.ArrayList;\n", + "\n", + "class Vehicle {\n", + " protected String registrationNumber;\n", + " protected int milesTraveled;\n", + " protected int lastMaintenanceMile;\n", + "\n", + " public Vehicle(String regNum) {\n", + " this.registrationNumber = regNum;\n", + " this.milesTraveled = 0;\n", + " this.lastMaintenanceMile = 0;\n", + " }\n", + "\n", + " public void addMiles(int miles) {\n", + " this.milesTraveled += miles;\n", + " }\n", + "\n", + " public void performMaintenance() {\n", + " this.lastMaintenanceMile = this.milesTraveled;\n", + " System.out.println(\"Maintenance performed for vehicle: \" + this.registrationNumber);\n", + " }\n", + "\n", + " public void checkMaintenanceDue() {\n", + " if ((this.milesTraveled - this.lastMaintenanceMile) > 10000) {\n", + " System.out.println(\"Vehicle: \" + this.registrationNumber + \" needs maintenance!\");\n", + " } else {\n", + " System.out.println(\"No maintenance required for vehicle: \" + this.registrationNumber);\n", + " }\n", + " }\n", + "\n", + " public void displayDetails() {\n", + " // Implemented in subclasses\n", + " }\n", + "}\n", + "\n", + "class Truck extends Vehicle {\n", + " private int capacityInTons;\n", + "\n", + " public Truck(String regNum, int capacity) {\n", + " super(regNum);\n", + " this.capacityInTons = capacity;\n", + " }\n", + "\n", + " @Override\n", + " public void displayDetails() {\n", + " System.out.println(\"Truck with Registration Number: \" + this.registrationNumber + \",\n", + "Capacity: \" + this.capacityInTons + \" tons.\");\n", + " }\n", + "}\n", + "\n", + "class Car extends Vehicle {\n", + " private String model;\n", + "\n", + " public Car(String regNum, String carModel) {\n", + " super(regNum);\n", + " this.model = carModel;\n", + " }\n", + "\n", + " @Override\n", + " public void displayDetails() {\n", + " System.out.println(\"Car with Registration Number: \" + this.registrationNumber + \", Model: \"\n", + "+ this.model + \".\");\n", + " }\n", + "}\n", + "\n", + "public class Main {\n", + " public static void main(String[] args) {\n", + " ArrayList fleet = new ArrayList<>();\n", + "\n", + " fleet.add(new Truck(\"XYZ1234\", 20));\n", + " fleet.add(new Car(\"ABC9876\", \"Sedan\"));\n", + "\n", + " for (Vehicle vehicle : fleet) {\n", + " vehicle.displayDetails();\n", + " vehicle.addMiles(10500);\n", + " vehicle.checkMaintenanceDue();\n", + " vehicle.performMaintenance();\n", + " vehicle.checkMaintenanceDue();\n", + " }\n", + " }\n", + "}\n", + "```\n", + "\n", + "Key points:\n", + "\n", + "- Used ArrayList instead of raw vectors\n", + "- Overrode methods using @Override annotation\n", + "- Used access modifiers properly (private, public)\n", + "- Followed naming conventions and formatting standards\n", + "- Implemented polymorphic behavior 
using abstract class and subclasses\n", + "\n", + "Let me know if you have any other questions!\n" + ] + } + ], + "source": [ + "response = textgen_llm(prompt)\n", + "\n", + "target_code = response[response.index('\\n')+1:]\n", + "\n", + "print_ww(target_code)" + ] + }, + { + "cell_type": "markdown", + "id": "9e9abc40", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "To conclude, we learnt that invoking the LLM without any context might not yield the desired results. By adding context and further using the prompt template to constrain the output from the LLM, we were able to successfully get our desired output." + ] + } + ], + "metadata": { + "availableInstances": [ + { + "_defaultOrder": 0, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.t3.medium", + "vcpuNum": 2 + }, + { + "_defaultOrder": 1, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.t3.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 2, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.t3.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 3, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.t3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 4, + "_isFastLaunch": true, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 5, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 6, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 7, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 8, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 9, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 10, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 11, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 12, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.m5d.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 13, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.m5d.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 14, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, +
"hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.m5d.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 15, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.m5d.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 16, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.m5d.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 17, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.m5d.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 18, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.m5d.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 19, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.m5d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 20, + "_isFastLaunch": false, + "category": "General purpose", + "gpuNum": 0, + "hideHardwareSpecs": true, + "memoryGiB": 0, + "name": "ml.geospatial.interactive", + "supportedImageNames": [ + "sagemaker-geospatial-v1-0" + ], + "vcpuNum": 0 + }, + { + "_defaultOrder": 21, + "_isFastLaunch": true, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 4, + "name": "ml.c5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 22, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 8, + "name": "ml.c5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 23, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.c5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 24, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.c5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 25, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 72, + "name": "ml.c5.9xlarge", + "vcpuNum": 36 + }, + { + "_defaultOrder": 26, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 96, + "name": "ml.c5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 27, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 144, + "name": "ml.c5.18xlarge", + "vcpuNum": 72 + }, + { + "_defaultOrder": 28, + "_isFastLaunch": false, + "category": "Compute optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.c5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 29, + "_isFastLaunch": true, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g4dn.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 30, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g4dn.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 31, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g4dn.4xlarge", 
+ "vcpuNum": 16 + }, + { + "_defaultOrder": 32, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g4dn.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 33, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g4dn.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 34, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g4dn.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 35, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 61, + "name": "ml.p3.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 36, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 244, + "name": "ml.p3.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 37, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 488, + "name": "ml.p3.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 38, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.p3dn.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 39, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.r5.large", + "vcpuNum": 2 + }, + { + "_defaultOrder": 40, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.r5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 41, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.r5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 42, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.r5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 43, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.r5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 44, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.r5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 45, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 512, + "name": "ml.r5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 46, + "_isFastLaunch": false, + "category": "Memory Optimized", + "gpuNum": 0, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.r5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 47, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 16, + "name": "ml.g5.xlarge", + "vcpuNum": 4 + }, + { + "_defaultOrder": 48, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 32, + "name": "ml.g5.2xlarge", + "vcpuNum": 8 + }, + { + "_defaultOrder": 49, + "_isFastLaunch": false, + "category": "Accelerated 
computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 64, + "name": "ml.g5.4xlarge", + "vcpuNum": 16 + }, + { + "_defaultOrder": 50, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 128, + "name": "ml.g5.8xlarge", + "vcpuNum": 32 + }, + { + "_defaultOrder": 51, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 1, + "hideHardwareSpecs": false, + "memoryGiB": 256, + "name": "ml.g5.16xlarge", + "vcpuNum": 64 + }, + { + "_defaultOrder": 52, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 192, + "name": "ml.g5.12xlarge", + "vcpuNum": 48 + }, + { + "_defaultOrder": 53, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 4, + "hideHardwareSpecs": false, + "memoryGiB": 384, + "name": "ml.g5.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 54, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 768, + "name": "ml.g5.48xlarge", + "vcpuNum": 192 + }, + { + "_defaultOrder": 55, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4d.24xlarge", + "vcpuNum": 96 + }, + { + "_defaultOrder": 56, + "_isFastLaunch": false, + "category": "Accelerated computing", + "gpuNum": 8, + "hideHardwareSpecs": false, + "memoryGiB": 1152, + "name": "ml.p4de.24xlarge", + "vcpuNum": 96 + } + ], + "instance_type": "ml.t3.medium", + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.8" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/06_CodeGeneration/README.md b/06_CodeGeneration/README.md new file mode 100644 index 00000000..0dde55a1 --- /dev/null +++ b/06_CodeGeneration/README.md @@ -0,0 +1,37 @@ +# Lab 6 - Code Generation + +## Overview + +In this lab, you will learn to use LLMs on Amazon Bedrock for code generation, SQL query creation, code explanation, and code translation across languages. We will demo Bedrock's API (boto3) as well as its integration with LangChain. + +First, we will generate Python code and SQL queries by providing context about a dataset. Next, we will explain code and translate between languages. We will explore these use cases with both the Bedrock API directly and via LangChain integration. + +## Audience + +Architects and developers who want to learn how to use Amazon Bedrock LLMs to generate, explain and translate code. + +Some of the business use cases for code generation include: + +- Code Translation +- Code Explain and Reviews +- Database or SQL query generation +- Rapid Prototyping +- Issue Identification +- Bug Fixing +- Code Optimization + +## Workshop Notebooks + +1. [Code Generation](./00_code_generatation_w_bedrock.ipynb)- Demonstrates how to generate Python code using Natural language. It shows examples of prompting to generate simple functions, classes, and full programs in Python for Data Analyst to perform sales analysis on a given Sales CSV dataset. + +2. [Database or SQL Query Generation](./01_sql_query_generate_w_bedrock.ipynb) - Focuses on generating SQL queries with Amazon Bedrock APIs. 
It includes examples of generating both simple and complex SQL statements for a given data set and database schema. + +3. [Code Explanation](./02_code_interpret_w_langchain.ipynb) - Uses Bedrock's foundation models to generate explanations for complex C++ code snippets. It shows how to carefully craft prompts to get the model to generate comments and documentation that explain the functionality and logic of complicated C++ code examples. Prompts can be easily updated for other programming languages. + +4. [Code Translation](./03_code_translate_w_langchain.ipynb) - Guides you through translating C++ code to Java using Amazon Bedrock and LangChain APIs. It shows techniques for prompting the model to port C++ code over to Java, handling differences in syntax, language constructs, and conventions between the languages. + + +## Architecture + +![Bedrock](./images/bedrock-code-gen.png) +![Bedrock](./images/bedrock-code-gen-langchain.png) \ No newline at end of file diff --git a/06_CodeGeneration/images/bedrock-code-gen-langchain.png b/06_CodeGeneration/images/bedrock-code-gen-langchain.png new file mode 100644 index 00000000..d829e6fd Binary files /dev/null and b/06_CodeGeneration/images/bedrock-code-gen-langchain.png differ diff --git a/06_CodeGeneration/images/bedrock-code-gen.png b/06_CodeGeneration/images/bedrock-code-gen.png new file mode 100644 index 00000000..3457eac2 Binary files /dev/null and b/06_CodeGeneration/images/bedrock-code-gen.png differ diff --git a/06_CodeGeneration/sales.csv b/06_CodeGeneration/sales.csv new file mode 100644 index 00000000..6f89b0af --- /dev/null +++ b/06_CodeGeneration/sales.csv @@ -0,0 +1,26 @@ +date,product_id,price,units_sold +2023-01-01,P001,50,20 +2023-01-02,P002,60,15 +2023-01-03,P001,50,18 +2023-01-04,P003,70,30 +2023-01-05,P001,50,25 +2023-01-06,P002,60,22 +2023-01-07,P003,70,24 +2023-01-08,P001,50,28 +2023-01-09,P002,60,17 +2023-01-10,P003,70,29 +2023-02-11,P001,50,23 +2023-02-12,P002,60,19 +2023-02-13,P001,50,21 +2023-02-14,P003,70,31 +2023-03-15,P001,50,26 +2023-03-16,P002,60,20 +2023-03-17,P003,70,33 +2023-04-18,P001,50,27 +2023-04-19,P002,60,18 +2023-04-20,P003,70,32 +2023-04-21,P001,50,22 +2023-04-22,P002,60,16 +2023-04-23,P003,70,34 +2023-05-24,P001,50,24 +2023-05-25,P002,60,21 \ No newline at end of file diff --git a/README.md b/README.md index 92e08cb8..469970a6 100644 --- a/README.md +++ b/README.md @@ -13,6 +13,7 @@ Labs include: - **Questions Answering** \[Estimated time to complete - 45 mins\] - **Chatbot** \[Estimated time to complete - 45 mins\] - **Image Generation** \[Estimated time to complete - 30 mins\] +- **Code Generation** \[Estimated time to complete - 30 mins\]
@@ -117,3 +118,13 @@ This repository contains notebook examples for the Bedrock Architecture Patterns ### Text to Image - [Image Generation with Stable Diffusion](./05_Image/Bedrock%20Stable%20Diffusion%20XL.ipynb): This notebook demonstrates image generation with using the Stable Diffusion model + +### Code Generation, SQL Generation, Code Translation and Explanation + +1. [Code Generation](./06_CodeGeneration/00_code_generatation_w_bedrock.ipynb) - Demonstrates how to generate Python code using natural language. It shows examples of prompting to generate simple functions, classes, and full programs in Python for a Data Analyst to perform sales analysis on a given Sales CSV dataset. + +2. [Database or SQL Query Generation](./06_CodeGeneration/01_sql_query_generate_w_bedrock.ipynb) - Focuses on generating SQL queries with Amazon Bedrock APIs. It includes examples of generating both simple and complex SQL statements for a given data set and database schema. + +3. [Code Explanation](./06_CodeGeneration/02_code_interpret_w_langchain.ipynb) - Uses Bedrock's foundation models to generate explanations for complex C++ code snippets. It shows how to carefully craft prompts to get the model to generate comments and documentation that explain the functionality and logic of complicated C++ code examples. Prompts can be easily updated for other programming languages. + +4. [Code Translation](./06_CodeGeneration/03_code_translate_w_langchain.ipynb) - Guides you through translating C++ code to Java using Amazon Bedrock and LangChain APIs. It shows techniques for prompting the model to port C++ code over to Java, handling differences in syntax, language constructs, and conventions between the languages.