---
language: en
tags:
- tapex
- table-question-answering
license: mit
---

# TAPEX (large-sized model)

TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Executor](https://arxiv.org/abs/2107.07653) by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. The original repo can be found [here](https://github.com/microsoft/Table-Pretraining).

## Model description

TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
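
To make the pre-training input concrete, here is a short sketch that decodes a prepared example to show how a SQL query and a table are linearized into the single sequence the executor consumes. The flattening sketched in the comment is an informal description, not a specification:

```python
from transformers import TapexTokenizer
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")

table = pd.DataFrame.from_dict({"year": [2008, 2012], "city": ["beijing", "london"]})
encoding = tokenizer(table=table, query="select city where year = 2012", return_tensors="pt")

# Decoding the prepared input ids shows the sequence the model actually sees:
# the query followed by the flattened table, roughly
# "select city where year = 2012 col : year | city row 1 : 2008 | beijing row 2 : ..."
print(tokenizer.decode(encoding["input_ids"][0]))
```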

TAPEX is based on the BART architecture: a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
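
This can be verified from the checkpoint's configuration; the numbers in the comment reflect BART-large in general and are stated as an assumption, not read from this card:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("microsoft/tapex-large-sql-execution")
# For BART-large one would expect 12 encoder layers, 12 decoder layers, hidden size 1024.
print(config.model_type, config.encoder_layers, config.decoder_layers, config.d_model)
```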

## Intended Uses

You can use the raw model for simulating neural SQL execution, i.e., employing TAPEX to execute a SQL query on a given table. Beyond that, TAPEX can be fine-tuned to tackle table question answering and table fact verification tasks, although not from this checkpoint (see the fine-tuning note below). See the [model hub](https://huggingface.co/models?search=tapex) to look for fine-tuned versions on a task that interests you.

### How to Use

Here is how to use this model in transformers:

```python
from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-sql-execution")

data = {
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
}
table = pd.DataFrame.from_dict(data)

# tapex accepts uncased input since it is pre-trained on the uncased corpus
query = "select year where city = beijing"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

outputs = model.generate(**encoding)

print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# ['2008']
```
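
Since the model is a learned executor rather than a lookup over cell strings, it can also be prompted with aggregate queries. Below is a variation on the example above; note that the decoded answer is a model prediction and, unlike the output of a real SQL engine, is not guaranteed to be correct:

```python
from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-sql-execution")

table = pd.DataFrame.from_dict({
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
})

# an aggregate query over the same table as above
query = "select count(city) where year > 2000"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```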

### How to Fine-tune

⚠️ This model checkpoint is **ONLY** used for simulating neural SQL execution (i.e., employing TAPEX to execute a SQL query on a given table), and you **CANNOT** use this model for fine-tuning on downstream tasks. The checkpoint that can be used for fine-tuning is available [here](https://huggingface.co/microsoft/tapex-large).
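
For reference, here is a minimal single-step fine-tuning sketch against that checkpoint. It assumes the `answer=...` target-encoding mode of `TapexTokenizer` used by the official transformers example scripts, and the question/answer pair is invented for illustration:

```python
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

# fine-tuning must start from microsoft/tapex-large, not from this checkpoint
tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large")

table = pd.DataFrame.from_dict({
    "year": [1896, 2008, 2012],
    "city": ["athens", "beijing", "london"]
})
question = "in which year did beijing host the olympic games?"  # hypothetical training pair
answer = "2008"

inputs = tokenizer(table=table, query=question, return_tensors="pt")
# calling the tokenizer with only `answer` encodes the target text, yielding seq2seq labels
labels = tokenizer(answer=answer, return_tensors="pt")["input_ids"]

loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()  # an optimizer step would follow in a real training loop
```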

> Two separate models are released for these two purposes because of a known issue with BART-large; see [this comment](https://github.com/huggingface/transformers/issues/15559#issuecomment-1062880564) for more details.

### BibTeX entry and citation info

```bibtex
@inproceedings{
    liu2022tapex,
    title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
    author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
    booktitle={International Conference on Learning Representations},
    year={2022},
    url={https://openreview.net/forum?id=O50443AsCP}
}
```