- Zoom abuses the installer flow on macOS to bypass permission dialogs (source)
- Zoom sends identifying device information to Facebook, even when users don't have a Facebook account (source) (fixed)
- A bug in Zoom sent identifying information (including email addresses and profile pictures) of thousands of users to strangers (source)
- Zoom claims in its white paper and marketing materials that meetings are end-to-end encrypted, but meetings are only encrypted in transit and are available in plaintext to Zoom servers and employees. (source)
- zoomAutenticationTool can be used to escalate privileges
```python
import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    # 1 - Get the bucket name and object key from the S3 event record
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    # 2 - Fetch the object and return its content type
    response = s3.get_object(Bucket=bucket, Key=key)
    return response['ContentType']
```
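For a quick local check of the handler above, the event payload can be faked; the bucket and key below are placeholders, not real resources, and the call only succeeds if matching AWS credentials and an object at that location actually exist:

```python
# Hypothetical S3 put event; bucket and key are placeholders for illustration only
fake_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "example-bucket"},
                "object": {"key": "folder/example%20file.txt"},
            }
        }
    ]
}

if __name__ == "__main__":
    print(lambda_handler(fake_event, None))
```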
```python
import re

ip_middle_octet = u"(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5]))"
ip_last_octet = u"(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))"

regex = re.compile(
    u"^"
    # protocol identifier
    u"(?:(?:https?|ftp)://)"
    # user:pass authentication
```
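The preview above cuts off mid-pattern. As a rough, self-contained sketch of the same technique, a simplified URL-validation regex (not the gist's full pattern) can be compiled and used like this:

```python
import re

# Simplified stand-in for the fuller pattern above: scheme, optional user:pass,
# host, optional port, optional path/query. Not the original gist's regex.
URL_RE = re.compile(
    r"^(?:https?|ftp)://"   # protocol identifier
    r"(?:\S+(?::\S*)?@)?"   # optional user:pass authentication
    r"[A-Za-z0-9.-]+"       # host (domain or IPv4, loosely matched)
    r"(?::\d{2,5})?"        # optional port
    r"(?:/\S*)?$"           # optional resource path / query
)

for candidate in ["https://user:pw@example.com:8080/path?q=1", "notaurl"]:
    print(candidate, bool(URL_RE.match(candidate)))
```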
- A variety of NLP-related packages are available; in particular, three packages are useful for training language models.

| package | note |
|---|---|
| transformers | Provides Transformer-based (masked) language model implementations and pretrained models |
| tokenizers | Provides training and use of tokenizers compatible with transformers; distributed as a separate package from transformers |
| nlp | Provides datasets and evaluation metrics |
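As a minimal sketch of how the first two packages fit together (the checkpoint name below is only an example, not one specified above), a pretrained masked language model and its tokenizer can be loaded and queried like this:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Example checkpoint; any masked-LM checkpoint on the Hugging Face hub works the same way
model_name = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Tokenize a sentence with a masked position and predict the missing token
inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
outputs = model(**inputs)

mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = outputs.logits[0, mask_index].argmax(-1).item()
print(tokenizer.decode([predicted_id]))
```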
```python
import torch
import copy
from torch import nn
from transformers import T5PreTrainedModel
from transformers.models.t5.modeling_t5 import T5Stack
from transformers.modeling_outputs import SequenceClassifierOutput
from torch.nn import BCEWithLogitsLoss, CrossEntropyLoss, MSELoss


def mean_pooling(inputs, mask):
    # Masked mean over the sequence dimension: average token states where mask == 1
    mask = mask.unsqueeze(-1).type_as(inputs)
    return (inputs * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
```
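A tiny sanity check of the pooling helper on dummy tensors (shapes are illustrative only):

```python
import torch

# Batch of 2 sequences, 4 tokens each, hidden size 3; second sequence has 2 padded positions
hidden = torch.randn(2, 4, 3)
mask = torch.tensor([[1, 1, 1, 1],
                     [1, 1, 0, 0]])

pooled = mean_pooling(hidden, mask)
print(pooled.shape)  # torch.Size([2, 3])
```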
```python
import torch.nn as nn
import torch.nn.functional as F
import math
from typing import Optional, Tuple


class BertSelfAttention(nn.Module):
    def __init__(self, config):
        super().__init__()
        # The hidden size must divide evenly across the attention heads
        if config.hidden_size % config.num_attention_heads != 0:
            raise ValueError(
                f"The hidden size ({config.hidden_size}) is not a multiple of the "
                f"number of attention heads ({config.num_attention_heads})"
            )
```
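The core computation a self-attention module like this performs is scaled dot-product attention over per-head queries, keys, and values. Below is a minimal standalone sketch of that computation, not the Hugging Face implementation itself:

```python
import math
import torch


def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, head_dim)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        # Block attention to masked-out positions before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v


q = k = v = torch.randn(1, 12, 16, 64)  # batch=1, 12 heads, 16 tokens, head_dim=64
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 12, 16, 64])
```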