Our ensembling shell script requires a python3.6 virtual environment named `zenith`. Once the setup is complete, leave the environment with:
```
deactivate
```
* If you only want to train single models, you can directly install the required libraries by executing
`pip install -r requirements.txt` in `/swp-metaphors/zenith/`.
## 2. Preparing data
First, copy all files stored in the `zenith/metaphor-detection/data/vua/` directory of the external data download into the same directory in this repository.
To prepare the data, run `python data_preparation.py vua`. It creates the following files in `/zenith/metaphor-detection/data/vua/`:
* `VUA_corpus_train.csv`, `VUA_corpus_val.csv`, `VUA_corpus_test.csv`: The different dataset splits
* `all_pos_test_tokens.pkl`, `verb_test_tokens.pkl`: Contain the offsets of the test tokens within their sentences
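The `.pkl` offset files are ordinary Python pickles. A minimal sketch of the round-trip they rely on, using a hypothetical data layout (the actual structure written by `data_preparation.py` is an assumption here, not documented in this README):

```python
import io
import pickle

# Hypothetical layout: sentence index -> offsets of the test tokens in that sentence.
offsets = {0: [1, 4], 1: [0]}

buf = io.BytesIO()
pickle.dump(offsets, buf)   # data_preparation.py would write to a .pkl file instead
buf.seek(0)
restored = pickle.load(buf)
print(restored)  # {0: [1, 4], 1: [0]}
```

To inspect the real files, open them in binary mode, e.g. `pickle.load(open("data/vua/verb_test_tokens.pkl", "rb"))`.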
## 3. Training Zenith
1. Navigate to the desired zenith folder in `zenith/metaphor-detection/code/`.
`code/` contains subfolders for the different architectures derived from the original baseline by [Kumar and Sharma](https://github.com/Kumar-Tarun/metaphor-detection). The subfolders are five models named `baseline`, `cn-features`, `concat`, `nb-only` and `glove-only` (MODELNAME), and each consists of four files: