v0.12.23 (#18050)
logan-markewich authored Mar 7, 2025
1 parent 54e0a2c commit 4c8d1d6
Showing 10 changed files with 313 additions and 310 deletions.
55 changes: 55 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,60 @@
# ChangeLog

## [2025-03-07]

### `llama-index-core` [0.12.23]

- added `merging_separator` argument to allow for specifying chunk merge separator in semantic splitter (#18027)
- Add support for running single-agent workflows within the BaseWorkflowAgent class (#18038)
- Fix the error raised when ReactAgent is created without an explicit system message (#18041)
- add a field keep_whitespaces to TokenTextSplitter (#17998)
- do not convert raw tool output to string in AgentWorkflow (#18006)
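
The two splitter entries above introduce new constructor options. A minimal sketch of how they might be used, assuming `merging_separator` (#18027) and `keep_whitespaces` (#17998) are plain keyword arguments as the entries suggest (not verified against the released signatures):

```python
# Assumed keyword arguments taken from the changelog entries above.
from llama_index.core import Document
from llama_index.core.embeddings import MockEmbedding
from llama_index.core.node_parser import SemanticSplitterNodeParser, TokenTextSplitter

docs = [Document(text="First sentence. Second sentence.\n\nA new paragraph.")]

# Per #18027: control the separator used when semantic chunks are merged.
semantic_splitter = SemanticSplitterNodeParser.from_defaults(
    embed_model=MockEmbedding(embed_dim=8),  # stand-in embedding model
    merging_separator="\n",  # assumed new keyword
)

# Per #17998: keep whitespace instead of stripping it while splitting.
token_splitter = TokenTextSplitter(
    chunk_size=64,
    chunk_overlap=8,
    keep_whitespaces=True,  # assumed new field
)

nodes = token_splitter.get_nodes_from_documents(docs)
```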

### `llama-index-embeddings-ollama` [0.6.0]

- feat: add client_kwargs Parameter to OllamaEmbedding Class (#18012)
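
A hedged sketch of the new `client_kwargs` parameter (#18012), assuming it is a dict of extra keyword arguments forwarded to the underlying Ollama client (the shape of the dict below is an assumption):

```python
from llama_index.embeddings.ollama import OllamaEmbedding

embed_model = OllamaEmbedding(
    model_name="nomic-embed-text",
    base_url="http://localhost:11434",
    # Assumed: extra kwargs passed through to the underlying Ollama client.
    client_kwargs={"headers": {"X-Example": "demo"}},
)

embedding = embed_model.get_text_embedding("hello world")
```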

### `llama-index-llms-anthropic` [0.6.10]

- anthropic caching and thinking updates (#18039)
- allow caching of tool results (#18028)
- support caching of anthropic system prompt (#18008)
- Ensure resuming a workflow actually works (#18023)
- [MarkdownNodeParser] Adding customizable header path separator char (#17964)
- feat: return event instance from run() when stop event is custom (#18001)

### `llama-index-llms-azure-openai` [0.3.2]

- AzureOpenAI: api_base and azure_endpoint are mutually exclusive (#18037)
- Add base_url to AzureOpenAI (#17996)
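
A short sketch of Azure configuration under the new rule (#18037) that `api_base` and `azure_endpoint` are mutually exclusive; pass only one of them. The deployment, endpoint, and version values below are placeholders:

```python
from llama_index.llms.azure_openai import AzureOpenAI

# Per #18037, supply either azure_endpoint or api_base, never both.
llm = AzureOpenAI(
    engine="my-gpt-4o-deployment",  # placeholder Azure deployment name
    model="gpt-4o",
    azure_endpoint="https://my-resource.openai.azure.com/",  # placeholder
    api_version="2024-08-01-preview",  # placeholder
    api_key="...",  # placeholder
)

print(llm.complete("Hello!"))
```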

### `llama-index-llms-bedrock-converse` [0.4.8]

- message text is required in boto3 model (#17989)

### `llama-index-llms-ollama` [0.5.3]

- Make request_timeout in Ollama LLM optional (#18007)
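
With #18007 the `request_timeout` argument becomes optional, so both constructions below should work (a sketch; the default timeout behavior is not verified here):

```python
from llama_index.llms.ollama import Ollama

# Explicit timeout, as before.
llm_with_timeout = Ollama(model="llama3.1", request_timeout=120.0)

# Per #18007, request_timeout may now be omitted.
llm_default = Ollama(model="llama3.1")

print(llm_default.complete("Say hi in one word."))
```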

### `llama-index-llms-mistralai` [0.4.0]

- MistralAI support for multimodal content blocks (#17997)
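
A hedged sketch of multimodal content blocks with MistralAI (#17997), assuming the entry refers to the core `TextBlock`/`ImageBlock` content-block types; the model name and image URL are placeholders:

```python
from llama_index.core.llms import ChatMessage, ImageBlock, TextBlock
from llama_index.llms.mistralai import MistralAI

llm = MistralAI(model="pixtral-large-latest", api_key="...")  # placeholders

# A single ChatMessage carrying mixed text and image blocks.
message = ChatMessage(
    role="user",
    blocks=[
        TextBlock(text="What is shown in this image?"),
        ImageBlock(url="https://example.com/cat.png"),  # placeholder URL
    ],
)

print(llm.chat([message]))
```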

### `llama-index-readers-file` [0.4.6]

- Bugfix: Use `torch.no_grad()` during inference in ImageVisionLLMReader when PyTorch is installed (#17970)

### `llama-index-storage-chat-store-mongo` [0.1.0]

- Feat/mongo chat store (#17979)

### `llama-index-core` [0.12.23]

- added `merging_separator` argument to allow for specifying chunk merge separator in semantic splitter (#18027)
- Add support for running single-agent workflows within the BaseWorkflowAgent class (#18038)
- Fix the error raised when ReactAgent is created without an explicit system message (#18041)
- add a field keep_whitespaces to TokenTextSplitter (#17998)

## [2025-02-28]

### `llama-index-core` [0.12.22]
55 changes: 55 additions & 0 deletions docs/docs/CHANGELOG.md
@@ -1,5 +1,60 @@
# ChangeLog

## [2025-03-07]

### `llama-index-core` [0.12.23]

- added `merging_separator` argument to allow for specifying chunk merge separator in semantic splitter (#18027)
- Add support for running single-agent workflows within the BaseWorkflowAgent class (#18038)
- Fix the error raised when ReactAgent is created without an explicit system message (#18041)
- add a field keep_whitespaces to TokenTextSplitter (#17998)
- do not convert raw tool output to string in AgentWorkflow (#18006)

### `llama-index-embeddings-ollama` [0.6.0]

- feat: add client_kwargs Parameter to OllamaEmbedding Class (#18012)

### `llama-index-llms-anthropic` [0.6.10]

- anthropic caching and thinking updates (#18039)
- allow caching of tool results (#18028)
- support caching of anthropic system prompt (#18008)
- Ensure resuming a workflow actually works (#18023)
- [MarkdownNodeParser] Adding customizable header path separator char (#17964)
- feat: return event instance from run() when stop event is custom (#18001)

### `llama-index-llms-azure-openai` [0.3.2]

- AzureOpenAI: api_base and azure_endpoint are mutually exclusive (#18037)
- Add base_url to AzureOpenAI (#17996)

### `llama-index-llms-bedrock-converse` [0.4.8]

- message text is required in boto3 model (#17989)

### `llama-index-llms-ollama` [0.5.3]

- Make request_timeout in Ollama LLM optional (#18007)

### `llama-index-llms-mistralai` [0.4.0]

- MistralAI support for multimodal content blocks (#17997)

### `llama-index-readers-file` [0.4.6]

- Bugfix: Use `torch.no_grad()` during inference in ImageVisionLLMReader when PyTorch is installed (#17970)

### `llama-index-storage-chat-store-mongo` [0.1.0]

- Feat/mongo chat store (#17979)

### `llama-index-core` [0.12.23]

- added `merging_separator` argument to allow for specifying chunk merge separator in semantic splitter (#18027)
- Add support for running single-agent workflows within the BaseWorkflowAgent class (#18038)
- Fix the error raised when ReactAgent is created without an explicit system message (#18041)
- add a field keep_whitespaces to TokenTextSplitter (#17998)

## [2025-02-28]

### `llama-index-core` [0.12.22]
4 changes: 4 additions & 0 deletions docs/docs/api_reference/storage/chat_store/mongo.md
@@ -0,0 +1,4 @@
::: llama_index.storage.chat_store.mongo
    options:
      members:
        - MongoChatStore
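
The reference entry above documents the new `llama-index-storage-chat-store-mongo` package (#17979). A hedged usage sketch, assuming a `MongoChatStore` constructor that takes a Mongo connection URI; the parameter names are assumptions, not verified against the package:

```python
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.storage.chat_store.mongo import MongoChatStore

# Assumed constructor signature; check the package for the exact parameters.
chat_store = MongoChatStore(mongo_uri="mongodb://localhost:27017")

memory = ChatMemoryBuffer.from_defaults(
    chat_store=chat_store,
    chat_store_key="user-123",
    token_limit=3000,
)
```
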
3 changes: 3 additions & 0 deletions docs/mkdocs.yml
Expand Up @@ -1539,6 +1539,7 @@ nav:
- ./api_reference/storage/chat_store/azurecosmosnosql.md
- ./api_reference/storage/chat_store/dynamodb.md
- ./api_reference/storage/chat_store/index.md
- ./api_reference/storage/chat_store/mongo.md
- ./api_reference/storage/chat_store/postgres.md
- ./api_reference/storage/chat_store/redis.md
- ./api_reference/storage/chat_store/simple.md
@@ -2402,6 +2403,8 @@ plugins:
- ../llama-index-integrations/retrievers/llama-index-retrievers-tldw
- ../llama-index-integrations/tools/llama-index-tools-valyu
- ../llama-index-integrations/postprocessor/llama-index-postprocessor-ibm
- ../llama-index-integrations/storage/chat_store/llama-index-storage-chat-store-mongo
- ../llama-index-integrations/llms/llama-index-llms-google-genai
- redirects:
redirect_maps:
./api/llama_index.vector_stores.MongoDBAtlasVectorSearch.html: api_reference/storage/vector_store/mongodb.md
2 changes: 1 addition & 1 deletion llama-index-core/llama_index/core/__init__.py
@@ -1,6 +1,6 @@
"""Init file of LlamaIndex."""

__version__ = "0.12.22"
__version__ = "0.12.23.post2"

import logging
from logging import NullHandler
8 changes: 7 additions & 1 deletion llama-index-core/llama_index/core/readers/file/base.py
@@ -121,7 +121,13 @@ def _format_file_timestamp(
if timestamp is None:
return None

timestamp_dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)
# Convert timestamp to UTC
# Check if timestamp is already a datetime object
if isinstance(timestamp, datetime):
timestamp_dt = timestamp.astimezone(timezone.utc)
else:
timestamp_dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)

if include_time:
return timestamp_dt.strftime("%Y-%m-%dT%H:%M:%SZ")
return timestamp_dt.strftime("%Y-%m-%d")
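
A self-contained sketch of the behavior the hunk above adds: the helper now accepts either a `datetime` or a numeric Unix timestamp and normalizes both to UTC before formatting (a standalone re-creation for illustration; the real function lives in `llama_index/core/readers/file/base.py`):

```python
from datetime import datetime, timezone
from typing import Optional, Union


def format_file_timestamp(
    timestamp: Optional[Union[float, datetime]],
    include_time: bool = False,
) -> Optional[str]:
    """Standalone re-creation of the patched helper, for illustration only."""
    if timestamp is None:
        return None

    # The patch handles datetime inputs directly instead of assuming a float.
    if isinstance(timestamp, datetime):
        timestamp_dt = timestamp.astimezone(timezone.utc)
    else:
        timestamp_dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)

    if include_time:
        return timestamp_dt.strftime("%Y-%m-%dT%H:%M:%SZ")
    return timestamp_dt.strftime("%Y-%m-%d")


# Both input types now format identically.
dt = datetime(2025, 3, 7, tzinfo=timezone.utc)
print(format_file_timestamp(dt.timestamp(), include_time=True))  # 2025-03-07T00:00:00Z
print(format_file_timestamp(dt, include_time=True))              # 2025-03-07T00:00:00Z
```
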
2 changes: 1 addition & 1 deletion llama-index-core/pyproject.toml
@@ -46,7 +46,7 @@ name = "llama-index-core"
packages = [{include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.12.22"
version = "0.12.23.post2"

[tool.poetry.dependencies]
SQLAlchemy = {extras = ["asyncio"], version = ">=1.4.49"}
@@ -34,7 +34,7 @@ readme = "README.md"
version = "0.4.0"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
python = ">=3.9,<4.0"
oci = "^2.134.0"
llama-index-core = "^0.12.0"
