Redis support #1459

Open · wants to merge 6 commits into main
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 4.3.1
current_version = 4.3.2
commit = true
tag = true
tag_name = {new_version}
Expand Down
2 changes: 1 addition & 1 deletion CONTRIBUTORS.rst
@@ -51,4 +51,4 @@ Contributors
* Zoltan Benedek
* Øyvind Heddeland Instefjord
* Pol Sanlorenzo

* David Colangelo
25 changes: 25 additions & 0 deletions docs/transport.rst
@@ -141,6 +141,31 @@ Another option is to use the InMemoryCache backend. It internally uses a
global dict to store urls with the corresponding content.


If you run your servers in a pool, you may want to share the WSDL cache across servers that make
similar calls. To do this, offload the cache to a shared Redis instance by configuring a Redis-backed cache as follows:

.. code-block:: python

    from zeep import Client
    from zeep.transports import Transport
    from zeep.cache import RedisCache

    cache = RedisCache(
        redis_host="127.0.0.1",
        password="APasswordYouLike",
        timeout=60,
    )

    transport = Transport(cache=cache)

    client = Client(
        'http://www.webservicex.net/ConvertSpeed.asmx?WSDL',
        transport=transport)
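The ``timeout`` passed to the cache above is a freshness window in seconds: entries older than that are treated as cache misses and refetched. A minimal sketch of that expiry rule (the ``is_fresh`` helper is illustrative, not part of zeep):

```python
import datetime

def is_fresh(written_at: datetime.datetime, timeout: int) -> bool:
    # Illustrative mirror of the expiry check: an entry counts as a
    # cache hit only while it is younger than `timeout` seconds.
    age = datetime.datetime.now(datetime.timezone.utc) - written_at
    return age.total_seconds() < timeout

now = datetime.datetime.now(datetime.timezone.utc)
stale = now - datetime.timedelta(seconds=120)
assert is_fresh(now, 60)        # just written: hit
assert not is_fresh(stale, 60)  # older than 60s: miss
```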




HTTP Authentication
-------------------
While some providers incorporate security features in the header of a SOAP message,
3 changes: 2 additions & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "zeep"
version = "4.3.1"
version = "4.3.2"
description = "A Python SOAP client"
readme = "README.md"
license = { text = "MIT" }
@@ -28,6 +28,7 @@ dependencies = [
"requests-toolbelt>=0.7.1",
"requests-file>=1.5.1",
"pytz",
"redis>=5.2.1"
]

[project.urls]
66 changes: 66 additions & 0 deletions src/zeep/cache.py
@@ -6,6 +6,8 @@
import threading
from contextlib import contextmanager
from typing import Dict, Tuple, Union
import json

import redis

import platformdirs
import pytz
@@ -163,6 +165,70 @@ def get(self, url):
return self._decode_data(data)
logger.debug("Cache MISS for %s", url)

class RedisCache(Base):
    """Cache contents in a Redis database.

    This is helpful if you make zeep calls from a pool of servers that
    need to share a common cache.
    """

    def __init__(
        self,
        redis_host,
        password,
        port=6379,
        timeout=3600,
        health_check_interval=10,
        socket_timeout=5,
        retry_on_timeout=True,
        single_connection_client=True,
    ):
        self._timeout = timeout
        self._redis_host = redis_host

        self._redis_client = redis.StrictRedis(
            host=redis_host,
            port=port,
            password=password,
            health_check_interval=health_check_interval,
            socket_timeout=socket_timeout,
            retry_on_timeout=retry_on_timeout,
            single_connection_client=single_connection_client,
        )

    def add(self, url, content):
        logger.debug("Caching contents of %s", url)
        # Drop any existing entry for this url before writing the new one
        self._redis_client.delete(url)

        try:
            # Serialize the content together with a UTC timestamp so we
            # know when the entry was written
            data = json.dumps(
                {
                    "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                    "value": base64.b64encode(content).decode("utf-8"),
                }
            )
            self._redis_client.set(url, value=data)
        except Exception as exc:
            logger.debug("Could not cache contents of %s", url)
            logger.debug(exc)

    def get(self, url):
        try:
            value = self._redis_client.get(url)
            if value is None:
                logger.debug("Cache MISS for %s", url)
                return None
            cached_value = json.loads(value)
        except Exception as exc:
            logger.debug("Could not read cached contents of %s", url)
            logger.debug(exc)
            # If the entry cannot be read or decoded, treat it as a miss
            return None

        written_at = datetime.datetime.fromisoformat(cached_value["time"])
        if not _is_expired(written_at, self._timeout):
            logger.debug("Cache HIT for %s", url)
            value = cached_value.get("value")
            if value is not None:
                return base64.b64decode(value)
            return None

        logger.debug("Cache MISS for %s", url)
        return None



def _is_expired(value, timeout):
"""Return boolean if the value is expired"""
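The ``add``/``get`` pair in this patch stores each entry as a JSON string holding a UTC timestamp plus the base64-encoded response body. The serialization round-trip can be sketched in isolation, with no Redis required (the ``encode_entry``/``decode_entry`` helpers are illustrative, not part of zeep):

```python
import base64
import datetime
import json

def encode_entry(content: bytes) -> str:
    # Illustrative mirror of RedisCache.add: pair a UTC timestamp
    # with the base64-encoded response body.
    return json.dumps({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "value": base64.b64encode(content).decode("utf-8"),
    })

def decode_entry(data: str) -> bytes:
    # Illustrative mirror of RedisCache.get: recover the original
    # bytes; the timestamp field drives the expiry check.
    payload = json.loads(data)
    return base64.b64decode(payload["value"])

wsdl = b"<definitions/>"
assert decode_entry(encode_entry(wsdl)) == wsdl
```

Base64 is needed because WSDL responses are stored as bytes while the JSON envelope is text; the timestamp is kept inside the value rather than relying on a Redis TTL, so the ``timeout`` check stays in zeep's hands.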