Yen-Ting Lin adamlin120

@adamlin120
adamlin120 / accelerate_config.yaml
Last active February 17, 2025 16:44
Launch Scripts for Multi-node Training with Axolotl
compute_environment: LOCAL_MACHINE
debug: false
distributed_type: MULTI_GPU
downcast_bf16: 'no'
gpu_ids: all
machine_rank: 0
main_training_function: main
mixed_precision: bf16
num_machines: 1
num_processes: 8
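The gist title mentions multi-node training, while the preview shows a single-node setup (num_machines: 1, num_processes: 8). As a rough sketch, the same accelerate config might be extended to two 8-GPU nodes as below; main_process_ip and main_process_port are placeholder values, and the second node would set machine_rank: 1.
compute_environment: LOCAL_MACHINE
distributed_type: MULTI_GPU
mixed_precision: bf16
machine_rank: 0             # set to 1 on the second node
main_process_ip: 10.0.0.1   # placeholder: address of the rank-0 node
main_process_port: 29500    # placeholder rendezvous port
num_machines: 2
num_processes: 16           # total GPU processes across both nodes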
@adamlin120
adamlin120 / dpo.yaml
Last active August 3, 2024 02:33
example_axolotl_config_for_dpo_on_tw_chatbot_arena_data
base_model: yentinglin/Llama-3-Taiwan-8B-Instruct
load_in_8bit: false
load_in_4bit: false
strict: false
hub_model_id: yentinglin/Llama-3-Taiwan-8B-Instruct-DPO
hub_strategy: end
wandb_name: 8b dpo
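The preview cuts off before the DPO-specific sections. As a hedged sketch of what an axolotl DPO run typically adds on top of the fields above; the dataset path and type are placeholders, not values from the gist.
rl: dpo
datasets:
  - path: your-org/tw-chatbot-arena-preferences   # placeholder preference dataset
    type: chatml.intel                            # assumes prompt/chosen/rejected columns
    split: train
output_dir: ./outputs/llama3-tw-dpo               # illustrative path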
Now let's get into the markets. Recently the market has produced yet more firsts. I have to say, since we started recording this show, it has been unreal how many extraordinary things we've seen. First came the circuit breakers: Buffett might expect to see one circuit breaker in a lifetime, and instead we watched six or seven of them. In his eighties, Buffett saw circuit breakers in rapid succession; before that he had only ever seen one, and suddenly there was a whole cluster of them, triggered both downward and upward. After that, WTI futures fell into negative territory. Everyone knows commodity futures can in theory go negative, but actually watching it happen was still a shock, and that move wiped out quite a few people.

More recently, the February natural gas futures contract spiked roughly 60 to 70 percent before settlement. That wasn't natural gas itself genuinely surging. Granted, gas was in an uptrend and climbing steadily, for a mix of reasons: an unusually cold Europe, the prospect of war, and so on. But that particular move looked to me more like a planned short squeeze, and, like the plunge into negative prices, it was probably big players at work: they saw what people were doing and took them out in a single wave. Notably, only the February contract blew up; the March and April contracts, the far months, showed nothing like it. In other words, it was likely a case of fairly thin short interest being swept up in one move.

Then, just recently, Facebook fell more than twenty percent, wiping out over 200 billion in market value. Absolutely absurd, and another historical record: who ever imagined a mega-cap could shed that much in one day? On that same day, after the close, Amazon, which had fallen during the session along with Facebook (the plunge dragged down much of the Nasdaq and even the NYSE), shot up sixteen or twenty percent, I forget which, once its earnings came out. Either way, that after-hours jump also set a record for a large-cap stock. So once again we've seen a burst of new records in a short span.

And my take is the same as ever: don't waste too much time explaining short-term moves. A few days ago, for instance, a Taiwanese journalist, with no byline, apparently not daring to sign the piece [unclear], wrote that the boss of "fill" [name unclear] had botched his dip-buying: "you spent 560 million and lost 6 million." And I thought, *, did you even read what you wrote? 560 million with a temporary paper loss of 6 million is a one percent swing. If a one percent swing makes you cry, maybe don't trade stocks at all. And right afterward, Netflix, lifted by the buying, shot up another leg. Isn't that embarrassing? You publish a piece mocking him and then the stock rebounds; how do you square that? That's what happens when you insist on finding a reason for every rise and fall. And then there's Amazon...
@adamlin120
adamlin120 / results_2024-04-05T19-00-21.717283.json
Created April 5, 2024 16:25
DBRX-instruct Open LLM Leaderboard (by lighteval)
{
"config_general": {
"lighteval_sha": "?",
"num_fewshot_seeds": 1,
"override_batch_size": 1,
"max_samples": null,
"job_id": "",
"start_time": 797173.72261996,
"end_time": 867125.5408054,
"total_evaluation_time_secondes": "69951.81818544003",
base_model: mistralai/Mistral-7B-v0.1
load_in_8bit: false
load_in_4bit: false
strict: false
datasets:
- path: erhwenkuo/wikipedia-zhtw
type: completion
base_model: NousResearch/Llama-2-7b-hf
model_type: LlamaForCausalLM
tokenizer_type: LlamaTokenizer
load_in_8bit: false
load_in_4bit: false
strict: false
datasets:
- path: yentinglin/books
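Both fragments above use axolotl's completion dataset type, which feeds each row of a text column to the model as raw causal-LM pretraining data. A minimal sketch of one such entry with the knobs spelled out; the field key is an assumption about the dataset's column name:
datasets:
  - path: erhwenkuo/wikipedia-zhtw   # Traditional-Chinese Wikipedia dump on the HF Hub
    type: completion                 # raw-text (causal LM) pretraining format
    field: text                      # assumed name of the text column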
@adamlin120
adamlin120 / mistral.yml
Created January 5, 2024 01:42
Sample axolotl config for australian legal corpus pretraining on mistral 7b with qlora
base_model: mistralai/Mistral-7B-v0.1
model_type: MistralForCausalLM
tokenizer_type: LlamaTokenizer
is_mistral_derived_model: true
load_in_8bit: false
load_in_4bit: true
strict: false
datasets:
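The preview stops at datasets:, before the QLoRA settings that the title and load_in_4bit: true imply. A hedged sketch of the adapter block such a config usually carries; the hyperparameter values are illustrative defaults, not taken from the gist:
adapter: qlora
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true   # apply LoRA to all linear layers
sequence_len: 4096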
@adamlin120
adamlin120 / mixtral-zhtw.yml
Last active December 24, 2023 00:58
Taiwan-LLM-MoE-chat-alpha based on Mixtral-8x7B-v0.1
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
model_type: AutoModelForCausalLM
tokenizer_type: LlamaTokenizer
trust_remote_code: true
load_in_8bit: false
load_in_4bit: true
strict: false
datasets:
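This preview is likewise truncated at datasets:. For a chat model such as Taiwan-LLM-MoE, the entries would plausibly look like the sketch below; the dataset path and conversation format are assumptions, not values from the gist:
datasets:
  - path: yentinglin/taiwan-chat   # placeholder dataset name
    type: sharegpt                 # assumes ShareGPT-style multi-turn conversations
    conversation: chatml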
#!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import torch
from transformers import pipeline

# Multilingual zero-shot classifier; use GPU 0 if available, else fall back to CPU.
classifier = pipeline(
    "zero-shot-classification",
    "symanto/xlm-roberta-base-snli-mnli-anli-xnli",
    device=0 if torch.cuda.is_available() else -1,
)
sequence_to_classify = "one day I will see the world"
candidate_labels = ['boao 博鰲']  # labels may be in any language the model covers
classifier(sequence_to_classify, candidate_labels)
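The pipeline returns a dict with the input sequence plus labels and scores sorted from most to least likely; a quick sketch of inspecting it (these field names are the zero-shot pipeline's documented return format):
result = classifier(sequence_to_classify, candidate_labels)
print(result["labels"])   # candidate labels, highest-scoring first
print(result["scores"])   # matching probabilities, same order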