Use GenAI package for google (#17939)
Showing 19 changed files with 2,321 additions and 0 deletions.

2 changes: 2 additions & 0 deletions
llama-index-integrations/llms/llama-index-llms-gemini/README.md

153 changes: 153 additions & 0 deletions
llama-index-integrations/llms/llama-index-llms-google-genai/.gitignore
@@ -0,0 +1,153 @@
llama_index/_static
.DS_Store
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
bin/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
etc/
include/
lib/
lib64/
parts/
sdist/
share/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
.ruff_cache

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints
notebooks/

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
pyvenv.cfg

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# Jetbrains
.idea
modules/
*.swp

# VsCode
.vscode

# pipenv
Pipfile
Pipfile.lock

# pyright
pyrightconfig.json

1 change: 1 addition & 0 deletions
llama-index-integrations/llms/llama-index-llms-google-genai/BUILD
@@ -0,0 +1 @@
poetry_requirements(name="poetry", module_mapping={"google-genai": ["google"]})
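
For context, the `module_mapping` above records for the build tooling that the `google-genai` distribution is imported under the `google` namespace package. A minimal sketch of how that underlying SDK is typically used directly (not part of this commit; the API key and model name are placeholder values):

```python
# The google-genai distribution installs into the `google` namespace package,
# which is what the module_mapping above captures for dependency inference.
from google import genai

# Direct use of the SDK client that the LlamaIndex integration wraps.
client = genai.Client(api_key="your_api_key_here")
response = client.models.generate_content(
    model="gemini-2.0-flash",  # example model name, adjust as needed
    contents="Say hello in one sentence.",
)
print(response.text)
```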

1 change: 1 addition & 0 deletions
llama-index-integrations/llms/llama-index-llms-google-genai/CHANGELOG.md
@@ -0,0 +1 @@
# CHANGELOG - llama-index-llms-google-genai

17 changes: 17 additions & 0 deletions
llama-index-integrations/llms/llama-index-llms-google-genai/Makefile
@@ -0,0 +1,17 @@
GIT_ROOT ?= $(shell git rev-parse --show-toplevel)

help:	## Show all Makefile targets.
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[33m%-30s\033[0m %s\n", $$1, $$2}'

format:	## Run code autoformatters (black).
	pre-commit install
	git ls-files | xargs pre-commit run black --files

lint:	## Run linters: pre-commit (black, ruff, codespell) and mypy
	pre-commit install && git ls-files | xargs pre-commit run --show-diff-on-failure --files

test:	## Run tests via pytest.
	pytest tests

watch-docs:	## Build and watch documentation.
	sphinx-autobuild docs/ docs/_build/html --open-browser --watch $(GIT_ROOT)/llama_index/

112 changes: 112 additions & 0 deletions
llama-index-integrations/llms/llama-index-llms-google-genai/README.md
@@ -0,0 +1,112 @@
# LlamaIndex Llms Integration: Google GenAI

## Installation

1. Install the required Python packages:

```bash
%pip install llama-index-llms-google-genai
!pip install -q llama-index google-genai
```

2. Set the Google API key as an environment variable:

```bash
%env GOOGLE_API_KEY=your_api_key_here
```
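
Alternatively, the key can be set from Python before constructing the LLM. A minimal sketch; the examples below assume `GOOGLE_API_KEY` is already present in the environment:

```python
import os

# Placeholder value; the examples below read GOOGLE_API_KEY from the environment.
os.environ["GOOGLE_API_KEY"] = "your_api_key_here"
```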

## Usage

### Basic Content Generation

To generate a poem using a Gemini model, use the following code:

```python
from llama_index.llms.google_genai import GoogleGenAI

resp = GoogleGenAI().complete("Write a poem about a magic backpack")
print(resp)
```

### Chat with Messages

To simulate a conversation, send a list of messages:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.google_genai import GoogleGenAI

messages = [
    ChatMessage(role="user", content="Hello friend!"),
    ChatMessage(role="assistant", content="Yarr what is shakin' matey?"),
    ChatMessage(
        role="user", content="Help me decide what to have for dinner."
    ),
]
resp = GoogleGenAI().chat(messages)
print(resp)
```
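
If you only need the reply text, the chat response exposes the assistant message directly (assuming the standard llama-index `ChatResponse` shape):

```python
# Access just the assistant's reply text from the ChatResponse.
print(resp.message.content)
```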

### Streaming Responses

To stream content responses in real time:

```python
from llama_index.llms.google_genai import GoogleGenAI

llm = GoogleGenAI()
resp = llm.stream_complete(
    "The story of Sourcrust, the bread creature, is really interesting. It all started when..."
)
for r in resp:
    print(r.text, end="")
```

To stream chat responses:

```python
from llama_index.llms.google_genai import GoogleGenAI
from llama_index.core.llms import ChatMessage

llm = GoogleGenAI()
messages = [
    ChatMessage(role="user", content="Hello friend!"),
    ChatMessage(role="assistant", content="Yarr what is shakin' matey?"),
    ChatMessage(
        role="user", content="Help me decide what to have for dinner."
    ),
]
resp = llm.stream_chat(messages)
```
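
To consume the chat stream, iterate over it and print each incremental delta; a short sketch, assuming the usual llama-index streaming `ChatResponse` objects with a `delta` field:

```python
# Each streamed ChatResponse carries the newly generated text in `delta`.
for r in resp:
    print(r.delta, end="")
```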

### Specific Model Usage

To use a specific model, you can configure it like this:

```python
from llama_index.llms.google_genai import GoogleGenAI

llm = GoogleGenAI(model="models/gemini-pro")
resp = llm.complete("Write a short, but joyous, ode to LlamaIndex")
print(resp)
```

### Asynchronous API

To use the asynchronous completion API:

```python
from llama_index.llms.google_genai import GoogleGenAI

llm = GoogleGenAI()
resp = await llm.acomplete("Llamas are famous for ")
print(resp)
```

For asynchronous streaming of responses:

```python
resp = await llm.astream_complete("Llamas are famous for ")
async for chunk in resp:
    print(chunk.text, end="")
```
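
Outside a notebook, where top-level `await` is unavailable, the same asynchronous calls can be wrapped in a small script. A minimal sketch, assuming `GOOGLE_API_KEY` is set in the environment:

```python
import asyncio

from llama_index.llms.google_genai import GoogleGenAI


async def main() -> None:
    llm = GoogleGenAI()

    # Await the full completion.
    resp = await llm.acomplete("Llamas are famous for ")
    print(resp)

    # Stream a second completion chunk by chunk.
    stream = await llm.astream_complete("Llamas are famous for ")
    async for chunk in stream:
        print(chunk.text, end="")


if __name__ == "__main__":
    asyncio.run(main())
```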

6 changes: 6 additions & 0 deletions
...index-integrations/llms/llama-index-llms-google-genai/llama_index/llms/google_genai/BUILD
@@ -0,0 +1,6 @@
python_sources()

resource(
    name="py_typed",
    source="py.typed",
)

3 changes: 3 additions & 0 deletions
...integrations/llms/llama-index-llms-google-genai/llama_index/llms/google_genai/__init__.py
@@ -0,0 +1,3 @@
from llama_index.llms.google_genai.base import GoogleGenAI

__all__ = ["GoogleGenAI"]