@pashu123
Created August 8, 2022 14:17
module attributes {torch.debug_module_name = "MaskedFillScalarDefaultModule"} {
  func.func @forward(%arg0: !torch.vtensor<[2,3],f32>, %arg1: !torch.vtensor<[2,3],i1>) -> !torch.vtensor<[2,3],f32> {
    %float5.000000e-01 = torch.constant.float 5.000000e-01
    %none = torch.constant.none
    %false = torch.constant.bool false
    %0 = torch.aten.tensor.float %float5.000000e-01, %none, %none, %false : !torch.float, !torch.none, !torch.none, !torch.bool -> !torch.vtensor<[],f32>
    %1 = torch.aten.Float.Tensor %0 : !torch.vtensor<[],f32> -> !torch.float
    %2 = torch.aten.masked_fill.Scalar %arg0, %arg1, %1 : !torch.vtensor<[2,3],f32>, !torch.vtensor<[2,3],i1>, !torch.float -> !torch.vtensor<[2,3],f32>
    return %2 : !torch.vtensor<[2,3],f32>
  }
}
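For reference, the IR above corresponds to a PyTorch module along these lines (a hypothetical reconstruction, since the original Python source is not part of the gist). The `torch.constant.float 5.000000e-01` is the fill value, the `torch.aten.tensor.float` / `torch.aten.Float.Tensor` pair materializes it as a 0-d tensor and extracts it back to a float, and `torch.aten.masked_fill.Scalar` replaces every element of the input where the mask is true with that scalar:

```python
import torch

class MaskedFillScalarDefaultModule(torch.nn.Module):
    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # Where mask is True, replace the element of x with 0.5;
        # elsewhere, keep the original value. The 0-d tensor round-trip
        # mirrors the tensor.float / Float.Tensor ops in the lowered IR.
        return x.masked_fill(mask, torch.tensor(0.5))

x = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
mask = torch.tensor([[True, False, True],
                     [False, True, False]])
print(MaskedFillScalarDefaultModule()(x, mask))
```

The tensor shapes and dtypes match the function signature in the IR: a `[2,3]` `f32` input, a `[2,3]` `i1` (boolean) mask, and a `[2,3]` `f32` result.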