Fixes for issues #11 and #9 and installation instructions for docker (#14)

Open · wants to merge 12 commits into base: bokeh
3 changes: 3 additions & 0 deletions .gitignore
@@ -127,3 +127,6 @@ dmypy.json

# Pyre type checker
.pyre/

# Swap files
*.swp
27 changes: 25 additions & 2 deletions README.md
@@ -19,12 +19,23 @@ pip install -r /path/to/freqtrade_analysis_notebook/requirements.txt

### Docker

This has not been tested in any docker environments, so YMMV, but some instructions that might be useful are in [this issue](https://github.com/froggleston/freqtrade_analysis_notebook/issues/1)
This has been tested under docker by @vaidab.

Use the instructions [here](https://www.freqtrade.io/en/stable/data-analysis/) for setting up Jupyter for freqtrade under docker.

Some instructions that might be useful are in [this issue](https://github.com/froggleston/freqtrade_analysis_notebook/issues/1).


## Installation

Follow one of the two methods below:

### Docker installation

- Copy the `.py` files and `RollingBacktestNotebook.ipynb` into `user_data/notebooks/`.
- Set the `freqtrade_dir` variable to `"/freqtrade"`, as shown in the sketch below.
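
A minimal sketch of the matching values in the notebook's configuration cell, assuming the standard freqtrade docker image layout (`/freqtrade` working directory); the config and data paths are only illustrations, so point them at your own files:

```python
from pathlib import Path

# Sketch for running under docker: "/freqtrade" is the working directory
# inside the official freqtrade image. Adjust the config and data paths to
# wherever your own files are mounted.
freqtrade_dir = "/freqtrade"
exchange = "binance"                                    # illustration only
config_file = f"{freqtrade_dir}/user_data/config.json"  # your config file
data_location = Path(freqtrade_dir, "user_data", "data", exchange)
```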


### Easiest installation

- Copy all `.py` files and the `RollingBacktestNotebook.ipynb` into your base freqtrade folder, **not** the user_data/notebooks folder.
@@ -67,6 +78,10 @@ Pick any of these to open up a new Jupyter file browser tab in your preferred browser

If using an IDE like VSCode, install an available Jupyter extension and open the freqtrade folder. Then open the `.ipynb` file and run the cells as normal.

### Via Docker

Connect to JupyterLab by following the instructions on the Freqtrade site.

## Usage

Use the toolbar at the top of the plot to change behaviour or select/deselect data series.
@@ -94,4 +109,12 @@ Mouseover main plot and subplot data series to see individual values.
### plotly

- works but slow
- not as fancy

### "No data found. Terminating." when running backtests

This usually happens when the `data_format` set in the notebook does not match the format of the data you downloaded.
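
A minimal sketch of keeping the two in agreement (variable names follow the notebook's configuration cell; the `feather` format and `binance` exchange are only illustrations):

```python
from pathlib import Path

# The format the notebook reads back must match what `freqtrade download-data`
# wrote (e.g. --data-format-ohlcv feather), otherwise no candles are found.
freqtrade_dir = "."
exchange = "binance"     # illustration only; use your exchange
data_format = "feather"  # must match the downloaded format (json, jsongz, feather, ...)
data_location = Path(freqtrade_dir, "user_data", "data", exchange)
```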

### "IOPub data rate exceeded." when running backtests

Lower the number of pairs in your pairlist.
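
One way to do this without touching your main config is to trim the whitelist inside the notebook. A sketch, assuming your configuration dict (`ft_config` in the notebook) contains the standard `exchange.pair_whitelist` entry:

```python
def trim_pairlist(config: dict, max_pairs: int = 20) -> dict:
    """Keep only the first `max_pairs` pairs so that less backtest output is
    streamed through Jupyter's IOPub channel."""
    pairs = config["exchange"]["pair_whitelist"]
    config["exchange"]["pair_whitelist"] = pairs[:max_pairs]
    return config

# Hypothetical usage in the notebook's configuration cell:
# ft_config = trim_pairlist(ft_config, max_pairs=20)
```
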
97 changes: 71 additions & 26 deletions RollingBacktestNotebook.ipynb
@@ -7,16 +7,19 @@
"source": [
"# FreqTrade backtesting analysis and plotting notebook\n",
"\n",
"Version: v1.1\n",
"Version: v1.2\n",
"\n",
"Date: 2023-07-08\n",
"Date: 2023-07-22\n",
"\n",
"GitHub: https://github.com/froggleston/freqtrade_analysis_notebook\n",
"\n",
"Authors:\n",
"* **@froggleston** (for the notebook and some of the notebook_helper code)\n",
"* **@rk** (for most of the notebook_helper code)\n",
"\n",
"Contributor:\n",
"* **@vaidab** (various bugfixes and docker setup & testing)\n",
"\n",
"### Welcome to the Notebook\n",
"\n",
"This notebook should make it easier to run sequential test backtests and compare them against a single benchmark backtest.\n",
@@ -117,16 +120,18 @@
"## main configuration for everything ##\n",
"\n",
"# main freqtrade dir\n",
"# use '.' if you copy the notebook files into your ft dir\n",
"freqtrade_dir = \".\"\n",
"\n",
"# If using docker:\n",
"freqtrade_dir = \".\" # use '.' if you've copied the notebook files into your ft dir\n",
"# freqtrade_dir = \"/freqtrade\" # if using docker\n",
"\n",
"if 'executed' not in globals():\n",
" executed = True # Guard against running this multiple times\n",
" cwd = os.getcwd() # equivalent to !pwd\n",
" # cwd = cwd[0] # only use if using !pwd above\n",
" sys.path.append(cwd) # Add notebook dir to python path for utility imports\n",
" # cd to root directory to make relative paths in config valid\n",
" %cd freqtrade_dir\n",
" os.chdir(\"/freqtrade/\")\n",
"\n",
"# parallelise backtests by month\n",
"parallel = True\n",
@@ -158,13 +163,13 @@
"# # set your config file\n",
"config_file = f\"{freqtrade_dir}/your_config.json\"\n",
"\n",
"# # set your format and path to downloaded data\n",
"# set your format and path to downloaded data\n",
"# data_format = \"json\"\n",
"# data_location = Path('/path', 'to', 'your', 'data', f'{exchange}')\n",
"# data_location = Path(freqtrade_dir, 'path', 'to', 'your', 'data', f'{exchange}')\n",
"\n",
"# set your stake currency and stake format\n",
"stake_currency = \"USDT\"\n",
" \n",
"\n",
"# set your chosen stoploss\n",
"stoploss = -0.125\n",
"\n",
@@ -183,7 +188,6 @@
"# uncomment for 1m detail\n",
"# timeframe_detail = \"1m\"\n",
"# ft_config['timeframe_detail'] = timeframe_detail\n",
"\n",
"ft_config['datadir'] = data_location\n",
"\n",
"if short:\n",
@@ -205,6 +209,22 @@
" 'dataformat_ohlcv':data_format,\n",
" 'stoploss': stoploss,\n",
" 'minimal_roi': minimal_roi,\n",
" \"entry_pricing\": {\n",
" \"price_side\": \"same\",\n",
" \"use_order_book\": True,\n",
" \"order_book_top\": 1,\n",
" \"price_last_balance\": 0.0,\n",
" \"check_depth_of_market\": {\n",
" \"enabled\": False,\n",
" \"bids_to_ask_delta\": 1\n",
" }\n",
" },\n",
" \"exit_pricing\": {\n",
" \"price_side\": \"same\",\n",
" \"use_order_book\": True,\n",
" \"order_book_top\": 1,\n",
" \"price_last_balance\": 0.0\n",
" } \n",
" }\n",
"else:\n",
" trading_mode = CandleType.SPOT\n",
@@ -224,6 +244,22 @@
" 'dataformat_ohlcv':data_format,\n",
" 'minimal_roi': minimal_roi,\n",
" 'enable_protections': enable_protections,\n",
" \"entry_pricing\": {\n",
" \"price_side\": \"same\",\n",
" \"use_order_book\": True,\n",
" \"order_book_top\": 1,\n",
" \"price_last_balance\": 0.0,\n",
" \"check_depth_of_market\": {\n",
" \"enabled\": False,\n",
" \"bids_to_ask_delta\": 1\n",
" }\n",
" },\n",
" \"exit_pricing\": {\n",
" \"price_side\": \"same\",\n",
" \"use_order_book\": True,\n",
" \"order_book_top\": 1,\n",
" \"price_last_balance\": 0.0\n",
" }\n",
" }\n",
"\n",
"bench_config = {\n",
@@ -261,8 +297,7 @@
"metadata": {
"pycharm": {
"name": "#%%t\n"
},
"scrolled": false
}
},
"outputs": [],
"source": [
@@ -326,9 +361,7 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": false
},
"metadata": {},
"outputs": [],
"source": [
"# run the test strategy for the same timerange as benchmark \n",
@@ -537,14 +570,19 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": false
"editable": true,
"slideshow": {
"slide_type": ""
},
"tags": []
},
"outputs": [],
"source": [
"import sqlite3\n",
"from freqtrade.data.btanalysis import load_backtest_data, load_backtest_stats\n",
"\n",
"# specify your actual trades from a dry/live DB\n",
"# db_path = f\"{freqtrade_dir}/user_data/your_db_name.sqlite\"\n",
"db_path = \"your_db_name.sqlite\"\n",
"\n",
"# if backtest_dir points to a directory, it'll automatically load the last backtest file\n",
@@ -553,7 +591,14 @@
"# or specify a specific backtest results file\n",
"# backtest_dir = config[\"user_data_dir\"] / \"backtest_results/backtest-result-2020-07-01_20-04-22.json\"\n",
"\n",
"dat = sqlite3.connect(db_path)\n",
"try:\n",
" if os.path.exists(db_path):\n",
" dat = sqlite3.connect(db_path)\n",
" else:\n",
" raise FileNotFoundError(db_path + \" does not exist\")\n",
"except sqlite3.Error as e:\n",
" print(f\"An error occurred connecting to the database: {e}\")\n",
"\n",
"sel_cols = \"pair,open_date,close_date,min_rate,max_rate,enter_tag,exit_reason,open_rate,close_rate,close_profit,close_profit_abs\"\n",
"query = dat.execute(f\"SELECT {sel_cols} FROM trades\")\n",
"cols = [column[0] for column in query.description]\n",
@@ -590,8 +635,12 @@
"bt_trades = bt_trades.loc[(bt_trades['open_date'] >= compare_start_date) & (bt_trades['open_date'] <= compare_end_date)]\n",
"sql_trades = sql_trades.loc[(sql_trades['open_date'] <= bt_end_date)]\n",
"\n",
"print(\"REAL\\n\", tabulate(sql_trades, headers='keys', tablefmt='psql', showindex=True))\n",
"print(\"BT\\n\", tabulate(bt_trades[sel_cols.split(\",\")], headers='keys', tablefmt='psql', showindex=True))\n",
"print(f\"Trades that match both timeranges: {sql_trades.shape[0]}/{num_real_trades} trades in the dry/live DB and {bt_trades.shape[0]}/{num_bt_trades} trades in the backtest file.\")\n",
"if sql_trades.shape[0] == 0 or bt_trades.shape[0] == 0:\n",
" raise Exception(\"No matching trades, check if backtesting period matches dry/live run\")\n",
"\n",
"print(\"DRY/LIVE\\n\", tabulate(sql_trades, headers='keys', tablefmt='psql', showindex=True))\n",
"print(\"BACKTESTING\\n\", tabulate(bt_trades[sel_cols.split(\",\")], headers='keys', tablefmt='psql', showindex=True))\n",
"\n",
"merged_df = pd.merge(sql_trades, bt_trades, how ='outer', on =['pair', 'open_date'], suffixes=('_sql', '_bt')).sort_values(by='open_date')\n",
"\n",
@@ -708,9 +757,7 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": false
},
"metadata": {},
"outputs": [],
"source": [
"enter_tags=\"all\"\n",
@@ -743,9 +790,7 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": false
},
"metadata": {},
"outputs": [],
"source": [
"# construct the plotting framework\n",
@@ -917,7 +962,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.10"
"version": "3.11.4"
},
"vscode": {
"interpreter": {
@@ -926,5 +971,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 4
}