- SQLova is a neural semantic parser that translates a natural language utterance into a SQL query. The name originates from the name of our department: Search & QLova (Search & Clova).
- Wonseok Hwang, Jinyeong Yim, Seunghyun Park, and Minjoon Seo.
- Affiliation: Clova AI Research, NAVER Corp., Seongnam, Korea.
- Technical report.
- We present a new state-of-the-art semantic parsing model that translates a natural language (NL) utterance into a SQL query.
- The model is evaluated on WikiSQL, a semantic parsing dataset consisting of 80,654 (NL, SQL) pairs over 24,241 tables from Wikipedia.
- We achieve 83.6% logical form accuracy and 89.6% execution accuracy on the WikiSQL test set.
- BERT-based table- and context-aware word embeddings.
- A sequence-to-SQL module leveraging recent work (Seq2SQL, SQLNet).
- Execution-guided decoding is applied in SQLova-EG (a sketch of the idea follows below).
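Execution-guided decoding, in brief, executes candidate queries during decoding and discards those that raise errors or return empty results. A minimal sketch of the idea, assuming a list of scored candidate SQL strings and an SQLite table (the function and interface here are illustrative, not SQLova's actual implementation):

```python
import sqlite3

def execution_guided_pick(candidates, db_path):
    """Return the highest-scoring candidate query that executes without
    error and yields a non-empty result. `candidates` holds
    (sql_string, model_score) pairs."""
    conn = sqlite3.connect(db_path)
    for sql, _score in sorted(candidates, key=lambda c: c[1], reverse=True):
        try:
            rows = conn.execute(sql).fetchall()
        except sqlite3.Error:
            continue          # ill-formed query: try the next candidate
        if rows:
            return sql        # first executable, non-empty candidate wins
    return candidates[0][0] if candidates else None  # fallback: top score
```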
| Model | Dev logical form accuracy | Dev execution accuracy | Test logical form accuracy | Test execution accuracy |
| --- | --- | --- | --- | --- |
| SQLova | 81.6 (+5.5)^ | 87.2 (+3.2)^ | 80.7 (+5.3)^ | 86.2 (+2.5)^ |
| SQLova-EG | 84.2 (+8.2)* | 90.2 (+3.0)* | 83.6 (+8.2)* | 89.6 (+2.5)* |
- ^: Compared to the current SOTA models that do not use execution-guided decoding.
- *: Compared to the current SOTA.
- The order of `WHERE` conditions is ignored when measuring logical form accuracy in our model (see the sketch below).
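Concretely, the comparison treats the `WHERE` clause as a set. A minimal sketch, assuming WikiSQL's JSON-style query representation (`sel`, `agg`, `conds`); this is an illustration, not the official evaluation code:

```python
def _cond_key(cond):
    col, op, val = cond
    return (col, op, str(val))  # stringify values so mixed types sort safely

def logical_form_match(pred, gold):
    """Compare two queries of the form
    {'sel': int, 'agg': int, 'conds': [[col, op, val], ...]},
    ignoring the order of the WHERE conditions."""
    if pred['sel'] != gold['sel'] or pred['agg'] != gold['agg']:
        return False
    return sorted(map(_cond_key, pred['conds'])) == sorted(map(_cond_key, gold['conds']))
```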
- `python3.6` or higher.
- `PyTorch 0.4.0` or higher.
- `CUDA 9.0`.
- Python libraries: `babel`, `matplotlib`, `defusedxml`, `tqdm`.
- Example:
  - Install miniconda.
  - Install the dependencies with conda:

```bash
conda install pytorch torchvision -c pytorch
conda install -c conda-forge records
conda install babel
conda install matplotlib
conda install defusedxml
conda install tqdm
```
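A quick way to confirm the environment matches the requirements above (illustrative check only):

```python
import torch

# The code was tested with PyTorch >= 0.4.0 on CUDA 9.0.
print(torch.__version__)          # expect 0.4.0 or higher
print(torch.cuda.is_available())  # expect True on a CUDA-capable machine
```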
- The code has been tested on a Tesla M40 GPU running on Ubuntu 16.04.4 LTS.
- Type the following on the terminal:

```bash
python3 train.py --seed 1 --bS 16 --accumulate_gradients 2 --bert_type_abb uS --fine_tune --lr 0.001 --lr_bert 0.00001 --max_seq_leng 222
```

- `--seed 1`: Set the seed of the random number generator. The accuracy changes by a few percent depending on the seed.
- `--bS 16`: Set the batch size to 16.
- `--accumulate_gradients 2`: Make the effective batch size `16 * 2 = 32` (see the sketch after this list).
- `--bert_type_abb uS`: Use the Uncased-Base BERT model. Use `uL` for the Uncased-Large BERT model.
- `--fine_tune`: Train BERT as well. Without this, only the sequence-to-SQL module is trained.
- `--lr 0.001`: Set the learning rate of the sequence-to-SQL module to 0.001.
- `--lr_bert 0.00001`: Set the learning rate of the BERT module to 0.00001.
- `--max_seq_leng 222`: Set the maximum input token length for BERT.
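Gradient accumulation sums gradients over several mini-batches before each optimizer step, so a per-step batch of 16 updates like a batch of 32. A self-contained PyTorch sketch of the mechanism with toy stand-ins for the model and data (not SQLova's actual training loop):

```python
import torch
from torch import nn

model = nn.Linear(4, 1)                       # toy stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
batches = [(torch.randn(16, 4), torch.randn(16, 1)) for _ in range(8)]  # bS = 16

accumulate_gradients = 2                      # effective batch size: 16 * 2 = 32
optimizer.zero_grad()
for i, (x, y) in enumerate(batches):
    loss = nn.functional.mse_loss(model(x), y)
    (loss / accumulate_gradients).backward()  # accumulate scaled gradients
    if (i + 1) % accumulate_gradients == 0:
        optimizer.step()                      # one update per two mini-batches
        optimizer.zero_grad()
```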
- The model should show ~79% logical form accuracy (`lx`) on the dev set after ~12 hrs (~10 epochs). Higher accuracy can be obtained with longer training, a different seed, the Uncased-Large BERT model, or execution-guided decoding.
- Add the `--EG` argument when running `train.py` to use execution-guided decoding.
- Whenever a higher logical form accuracy is achieved on the dev set, the following three files are saved in the current folder:
  - `model_best.pt`: the checkpoint of the sequence-to-SQL module.
  - `model_bert_best.pt`: the checkpoint of the BERT module.
  - `results_dev.jsonl`: JSON file for the official evaluation.
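To restore the saved modules later, something like the following should work; the internal layout of the checkpoint files is an assumption here, so inspect the loaded objects before wiring them into a model:

```python
import torch

# Assumed layout: these may be raw state_dicts or wrapping dicts; check first.
ckpt = torch.load('model_best.pt', map_location='cpu')
ckpt_bert = torch.load('model_bert_best.pt', map_location='cpu')
print(type(ckpt), list(ckpt.keys()) if isinstance(ckpt, dict) else None)
```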
- `Shallow-Layer` and `Decoder-Layer` models can be trained similarly (`train_shallow_layer.py`, `train_decoder_layer.py`).
- To calculate logical form and execution accuracies on the dev set using the official evaluation script:
  - Download the original WikiSQL dataset.
  - `tar xvf data.tar.bz2`
  - Move the extracted files under `$HOME/data/WikiSQL-1.1/data`.
  - Set the paths in `evaluation_ws.py`. This file is the original `evaluation.py` script with the path information added; alternatively, use the original `evaluation.py` and set the file paths yourself.
  - Type `python3 evaluation_ws.py` on the terminal.
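For intuition, execution accuracy counts a prediction as correct when the predicted and gold queries return the same result on the table. A simplified sketch of that check against an SQLite table (illustrative only, not the official `evaluation.py` logic):

```python
import sqlite3
from collections import Counter

def execution_match(pred_sql, gold_sql, db_path):
    """True if both queries execute and return the same multiset of rows."""
    conn = sqlite3.connect(db_path)
    try:
        pred_rows = conn.execute(pred_sql).fetchall()
        gold_rows = conn.execute(gold_sql).fetchall()
    except sqlite3.Error:
        return False  # a query that fails to execute counts as wrong
    return Counter(pred_rows) == Counter(gold_rows)
```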
- Uncomment lines 550-557 of `train.py` to load `test_loader` and `test_table`.
- In the `test(...)` function, use `test_loader` and `test_table` instead of `dev_loader` and `dev_table`.
- Save the output of `test(...)` with the `save_for_evaluation(...)` function.
- Evaluate with `evaluation_ws.py` as before.
- Pretrained SQLova model parameters are uploaded in the release. To start from them, uncomment lines 562-565 of `train.py` and set the paths.
- Pretrained BERT models were downloaded from the official repository.
- The BERT code is from huggingface-pytorch-pretrained-BERT.
- The sequence-to-SQL model started from the source code of SQLNet and was significantly rewritten while keeping SQLNet's basic column-attention and sequence-to-set structure.
- The data is annotated using `annotate_ws.py`, which is based on `annotate.py` from the WikiSQL repository. The tokens of the natural language query, and the start and end indices of the where-conditions over those tokens, are annotated (see the sketch after this list).
- Pre-trained BERT parameters can be downloaded from the official BERT repository and converted to a `pt` file following the instructions from huggingface-pytorch-pretrained-BERT.
- For convenience, the annotated WikiSQL data and the PyTorch-converted pre-trained BERT parameters are available here.
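The where-condition annotation amounts to locating each condition value inside the tokenized question. A simplified sketch of that alignment (hypothetical helper, not the actual `annotate_ws.py` code):

```python
def find_value_span(question_tokens, value_tokens):
    """Return the (start, end) token indices (inclusive) of the condition
    value inside the tokenized question, or None if it cannot be aligned."""
    q = [t.lower() for t in question_tokens]
    v = [t.lower() for t in value_tokens]
    for start in range(len(q) - len(v) + 1):
        if q[start:start + len(v)] == v:
            return start, start + len(v) - 1
    return None

# Example: find_value_span(['which', 'city', 'hosted', 'the', '1900', 'olympics'],
#                          ['1900']) -> (4, 4)
```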
Copyright 2019-present NAVER Corp.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.