Hi!
I would like to ask whether it is possible to turn off logging, or change the logging level, from a Python script that uses the nlu library.
Even a simple `import nlu` produces several lines of logs, and loading models generates tons of them...
Before importing nlu, I tried creating a PySpark context myself and setting the desired log level, as suggested by the message printed during `import nlu`:
`Setting default log level to "WARN". To adjust logging level use sc.setLogLevel(newLevel)`
But it doesn't seem to help; in fact it's the opposite, I then can't load models and run predictions at all (see the sketch below).
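Roughly what I'm doing (simplified; the `sentiment` model is just an example, my real pipeline differs):

```python
from pyspark.sql import SparkSession

# Create the Spark context myself, before nlu gets a chance to, and lower the log level
spark = SparkSession.builder.appName("nlu-quiet").getOrCreate()
spark.sparkContext.setLogLevel("ERROR")

import nlu  # imported after Spark on purpose

pipe = nlu.load("sentiment")            # with the pre-created context this fails for me
df = pipe.predict("This is a test.")
print(df)
```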
The other approach was setting the level of every Python logger I could find (nlu, py4j, py4j.java_gateway) to CRITICAL in my case, but that didn't help either.
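For reference, this is how I set those levels (plain `logging` module, nothing nlu-specific):

```python
import logging

# Silence every Python-side logger I could identify
for name in ("nlu", "py4j", "py4j.java_gateway"):
    logging.getLogger(name).setLevel(logging.CRITICAL)
```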
There are still messages such as `WARN SparkSession$Builder`, `WARN ApacheUtils`, `I tensorflow/core/platform/cpu_feature_guard.cc:142`, etc...
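The only further workarounds I can think of for that remaining output are the TensorFlow environment variable and the JVM-side log4j API, but I don't know whether that is the intended way with nlu, and `_jvm` is Spark-internal, so it may not work on every version:

```python
import os

# Must be set before TensorFlow is loaded (i.e. before `import nlu`):
# 0 = all, 1 = hide INFO, 2 = hide INFO and WARNING, 3 = hide everything but FATAL
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Try to quiet the JVM-side log4j loggers (SparkSession$Builder, ApacheUtils, ...)
log4j = spark.sparkContext._jvm.org.apache.log4j
log4j.LogManager.getRootLogger().setLevel(log4j.Level.ERROR)
```

Is there a supported way to do this when using nlu?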