#!/bin/bash
# This script downloads and compiles the OntoNotes 2012 data in a format
# useful for coreference resolution. It generates three files:
# {train, dev, test}.english.v4_gold_conll, as well as a directory
# 'conll-2012' which contains the raw extracted data.
# The script downloads and runs some Python scripts which require Python 2.x.

ONTONOTES_PATH=$1
LANGUAGE=$2
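
For context, here is a minimal sketch (not part of the original script) of how the generated *.english.v4_gold_conll output could be consumed in Python. It assumes the standard CoNLL-2012 layout: documents delimited by "#begin document"/"#end document" markers, sentences separated by blank lines, the word form in column 3 (0-indexed), and the coreference annotation in the last column. The function name and the hard-coded file name are hypothetical.

def read_conll_documents(path):
    """Yield each document as a list of sentences, where a sentence is a
    list of (word, coref_column) pairs."""
    document, sentence = [], []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("#begin document"):
                document, sentence = [], []
            elif line.startswith("#end document"):
                if sentence:
                    document.append(sentence)
                    sentence = []
                yield document
            elif not line:
                # Blank line closes the current sentence.
                if sentence:
                    document.append(sentence)
                sentence = []
            else:
                cols = line.split()
                # Column 3 is the word form; the last column holds the
                # coreference spans, e.g. "(12", "12)", "(12)", or "-".
                sentence.append((cols[3], cols[-1]))

if __name__ == "__main__":
    for doc in read_conll_documents("train.english.v4_gold_conll"):
        print(len(doc), "sentences in first document")
        break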
# Adapted from the Triton implementation of FlashAttention-2:
# https://github.com/openai/triton/blob/main/python/tutorials/06-fused-attention.py
import time

import torch
import torch.utils.benchmark as benchmark

import triton
import triton.language as tl


@triton.jit
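# NOTE: the fused-attention kernel body is truncated in this excerpt. As a
# stand-in (a hypothetical example, not the flash-attention kernel itself),
# the @triton.jit decorator above is applied here to a minimal elementwise
# add kernel to illustrate the structure such kernels share: each program
# instance computes its block offsets, loads a masked tile, and stores results.
def _add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)  # index of this program instance in the grid
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard out-of-bounds lanes in the last block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def _add(x, y):
    # Host-side launcher: a 1D grid with enough programs to cover every element.
    out = torch.empty_like(x)
    n = out.numel()
    grid = (triton.cdiv(n, 1024),)
    _add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out


if __name__ == "__main__" and torch.cuda.is_available():
    # Quick timing with torch.utils.benchmark, matching the imports above.
    x = torch.randn(4096, device="cuda")
    y = torch.randn(4096, device="cuda")
    t = benchmark.Timer(stmt="_add(x, y)", globals={"_add": _add, "x": x, "y": y})
    print(t.timeit(100))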