I am using a GPU to follow along with the training process of the model. I have cloned the GitHub repo, installed the dependencies from requirements.txt, installed NLTK punkt, and downloaded the Spider dataset. Then, while executing the "training the parser" step via exec.py as follows:

    (SmBop) rajkrishna@deeplearning-02:~/SmBop$ python3 exec.py

I am getting the following output and error:

    {'name': 'None', 'force': 'false', 'gpu': '0', 'recover': 'false', 'debug': 'false', 'detect_anomoly': 'false', 'profile': 'false', 'is_oracle': 'false', 'tiny_dataset': 'false', 'load_less': 'false', 'cntx_rep': 'false', 'cntx_beam': 'false', 'disentangle_cntx': 'true', 'cntx_reranker': 'true', 'value_pred': 'true', 'use_longdb': 'true', 'uniquify': 'false', 'use_bce': 'false', 'tfixup': 'false', 'train_as_dev': 'false', 'amp': 'true', 'utt_aug': 'true', 'should_rerank': 'false', 'use_treelstm': 'false', 'db_content': 'true', 'lin_after_cntx': 'false', 'optimizer': 'adam', 'rat_layers': '8', 'beam_size': '30', 'base_dim': '32', 'num_heads': '8', 'beam_encoder_num_layers': '1', 'tree_rep_transformer_num_layers': '1', 'dropout': '0.1', 'rat_dropout': '0.2', 'lm_lr': '3e-06', 'lr': '0.000186', 'batch_size': '20', 'grad_acum': '4', 'max_steps': '60000', 'power': '0.5', 'temperature': '1.0', 'grad_clip': '-1', 'grad_norm': '-1'}
    experiment_name: jumpy-viridian-guppy
    Traceback (most recent call last):
      File "exec.py", line 146, in <module>
        run()
      File "exec.py", line 137, in run
        train_model(
      File "/home/rajkrishna/.local/lib/python3.8/site-packages/allennlp/commands/train.py", line 226, in train_model
        training_util.create_serialization_dir(params, serialization_dir, recover, force)
      File "/home/rajkrishna/.local/lib/python3.8/site-packages/allennlp/training/util.py", line 232, in create_serialization_dir
        os.makedirs(serialization_dir, exist_ok=True)
      File "/usr/lib/python3.8/os.py", line 213, in makedirs
        makedirs(head, exist_ok=exist_ok)
      File "/usr/lib/python3.8/os.py", line 213, in makedirs
        makedirs(head, exist_ok=exist_ok)
      File "/usr/lib/python3.8/os.py", line 213, in makedirs
        makedirs(head, exist_ok=exist_ok)
      File "/usr/lib/python3.8/os.py", line 223, in makedirs
        mkdir(name, mode)
    PermissionError: [Errno 13] Permission denied: '/media/disk1'

I am also attaching a screenshot:

![image](https://user-images.githubusercontent.com/91602311/147635085-395c40fa-e536-48fc-a6ac-7930ebfbd980.png)

Kindly help me resolve the error.
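For context: the failure comes from AllenNLP's create_serialization_dir, which calls os.makedirs on the configured output path. If a parent of that path (here /media/disk1) cannot be created or written by the current user, Python raises PermissionError: [Errno 13]. A minimal sketch for checking whether a candidate serialization path is actually writable before launching training; the candidate path below is hypothetical, chosen only to mirror the traceback:

```python
import os

def nearest_existing_parent(path: str) -> str:
    """Walk up from `path` until a directory that already exists is found."""
    probe = os.path.abspath(path)
    while not os.path.exists(probe):
        probe = os.path.dirname(probe)
    return probe

def is_writable(path: str) -> bool:
    """True if `path` could be created, i.e. its nearest existing parent is writable."""
    return os.access(nearest_existing_parent(path), os.W_OK)

# Hypothetical serialization path, mirroring the one in the traceback above.
candidate = "/media/disk1/experiments/jumpy-viridian-guppy"
if not is_writable(candidate):
    print(f"Cannot create {candidate}: pick a directory you own, e.g. ~/smbop_experiments")
```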
I'm hitting the same error.
I solved this problem by changing the output-path prefix to a directory I own :), e.g. "home/code/smbop/".
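For anyone else hitting this, the same idea is to point the serialization directory at a location under your own home directory before AllenNLP's train_model is called. The sketch below is illustrative only and assumes you edit the place in exec.py where the output path is built; the directory name and the params variable are placeholders, not the repo's actual identifiers:

```python
import os
from allennlp.commands.train import train_model  # same API the traceback goes through

# Illustrative: build the output directory under $HOME instead of /media/disk1.
experiment_name = "jumpy-viridian-guppy"  # the name printed by exec.py above
serialization_dir = os.path.expanduser(os.path.join("~", "smbop_experiments", experiment_name))
os.makedirs(serialization_dir, exist_ok=True)  # succeeds because the user owns $HOME

# `params` stands for the allennlp Params object that exec.py already constructs;
# with a writable serialization_dir the original PermissionError no longer occurs.
# train_model(params=params, serialization_dir=serialization_dir)
```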