Maziyar Panahi (maziyarpanahi) 😎 Building a private medical ChatGPT!
Mapping citations
First, I’m identifying each citation within the text and matching it to the Documents section, noting the citation format and ensuring each citation is relevant to the sentence it supports.
Cross-checking citations
I’m piecing together how the RESPONSE references documents like “According to Document 3” and “Document 9 highlights the importance of psychological flexibility” to ensure the citations accurately reflect the content in the DOCUMENTS.
Verifying citations
I’m checking the accuracy of each citation in the RESPONSE by cross-referencing it with the REFERENCES section, confirming that every citation aligns precisely with the document it points to.
ACTIONS NEEDED:
I will read the RESPONSE paragraph by paragraph, assessing citations for appropriateness. Summarizing content may be necessary due to length.
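The matching step described above can be sketched as a simple scan for "Document N" mentions checked against the available documents (a minimal illustration, not the actual grader; the `Document 12` citation and the document snippets are hypothetical, added only to show an invalid reference being filtered out):

```python
import re

# Hypothetical DOCUMENTS section: only documents 3 and 9 exist.
documents = {
    3: "…content of Document 3…",
    9: "…the importance of psychological flexibility…",
}

# Hypothetical RESPONSE citing two real documents and one that does not exist.
response = (
    "According to Document 3, the treatment was effective. "
    "Document 9 highlights the importance of psychological flexibility. "
    "Document 12 adds further detail."
)

# Collect every "Document N" citation, then keep only those that point at a real document.
cited = [int(n) for n in re.findall(r"Document (\d+)", response)]
valid = [n for n in cited if n in documents]
print(cited, valid)  # [3, 9, 12] [3, 9]
```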
base_model: /workspace/models/Mistral-Nemo-Base-2407
model_type: AutoModelForCausalLM
tokenizer_type: AutoTokenizer
load_in_8bit: false
# load_in_4bit: true
strict: false
datasets:
- path: /workspace/datasets/dolphin-2.9.3/dolphin201-sharegpt2.jsonl
  File "/root/miniconda3/envs/py3.10/lib/python3.10/site-packages/trl/trainer/utils.py", line 338, in __call__
    to_pad = [torch.LongTensor(ex[k]) for ex in features]
  File "/root/miniconda3/envs/py3.10/lib/python3.10/site-packages/trl/trainer/utils.py", line 338, in <listcomp>
    to_pad = [torch.LongTensor(ex[k]) for ex in features]
TypeError: 'NoneType' object cannot be interpreted as an integer
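The TypeError in this traceback means one of the collated feature values is None where an integer token id is expected. A minimal stand-in (assumption: `torch.LongTensor` converts each element through the integer protocol, the same conversion `operator.index` performs, so a None in a token-id list produces the same message):

```python
import operator

# Stand-in for the per-element integer conversion torch.LongTensor performs;
# a None inside a token-id list triggers the error seen in the trl collator.
def to_long_list(values):
    return [operator.index(v) for v in values]

message = ""
try:
    to_long_list([1, 2, None])
except TypeError as e:
    message = str(e)

print(message)  # 'NoneType' object cannot be interpreted as an integer
```

In practice this points at a dataset row where a field the collator reads came back as None, so the fix is usually filtering or repairing those rows before training.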

Let's check out the PR:

git fetch origin pull/625/head:dbrx
git switch dbrx
pip install -vvv --no-build-isolation -e .

Download the model:

@maziyarpanahi
maziyarpanahi / miqu-upload-hf.py
Created February 9, 2024 22:54 — forked from 152334H/miqu-upload-hf.py
upload miqu ckpt to hf
from transformers import LlamaConfig as LC, LlamaForCausalLM as LLM, LlamaTokenizer as LT
from accelerate import init_empty_weights, load_checkpoint_and_dispatch
import torch

# Reuse the Llama-2 tokenizer, and the 70B config widened to miqu's longer context
lt = LT.from_pretrained("NousResearch/Llama-2-7b-hf")
c = LC.from_pretrained("NousResearch/Llama-2-70b-hf")
c.max_position_embeddings = 32764
c.rope_theta = 1000000

# Instantiate the model on the meta device (no weight memory allocated), then freeze it in fp16
with init_empty_weights(): m = LLM(c)
m = m.half().eval()
m.requires_grad_(False)
@maziyarpanahi
maziyarpanahi / gguf-merge.sh
Created February 2, 2024 08:30 — forked from crasm/gguf-merge.sh
Shell script for merging TheBloke's .gguf-split model files
#!/bin/sh
log() {
format="$1"; shift
# shellcheck disable=SC2059
>&2 printf "$format\n" "$@"
}
usage() {
>&2 cat <<EOF
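The log() helper above prints a printf-formatted message to stderr, which keeps diagnostics out of the script's stdout. A quick sketch of calling it standalone (the message text is just an example; stderr is redirected here so the output can be captured):

```shell
# Same helper as in the gist above.
log() {
	format="$1"; shift
	# shellcheck disable=SC2059
	>&2 printf "$format\n" "$@"
}

# 2>&1 folds the stderr message into the command substitution.
msg=$(log "merging part %s of %s" 1 3 2>&1)
echo "$msg"
```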
@maziyarpanahi
maziyarpanahi / PVE-HP-ssacli-smart-storage-admin.md
Created March 31, 2023 09:26 — forked from mrpeardotnet/PVE-HP-ssacli-smart-storage-admin.md
HP Smart Storage Admin CLI (ssacli) installation and usage on Proxmox PVE (6.x)

Why use HP Smart Storage Admin CLI?

You can use the ssacli (Smart Storage Administrator command line interface) tool to manage any of the supported HP Smart Array Controllers in your Proxmox host without needing to reboot the server to access the Smart Storage Administrator in the BIOS. That means no host downtime when managing your storage.

The CLI is not as convenient as the GUI provided by the BIOS or by desktop utilities, but it still lets you fully manage your controller, physical disks, and logical drives on the fly, with no Proxmox host downtime.

ssacli replaces the older hpssacli tool; it shares the same syntax and adds support for newer servers and controllers.

Installation

@maziyarpanahi
maziyarpanahi / tours.json
Created January 19, 2023 14:49
tours.json
[
  {
    "tourBlurb" : "Big Sur is big country. The Big Sur Retreat takes you to the most majestic part of the Pacific Coast and show you the secret trails.",
    "tourName" : "Big Sur Retreat",
    "tourPackage" : "Backpack Cal",
    "tourBullets" : "\"Accommodations at the historic Big Sur River Inn, Privately guided hikes through any of the 5 surrounding national parks, Picnic lunches prepared by the River Inn kitchen, Complimentary country breakfast, Admission to the Henry Miller Library and the Point Reyes Lighthouse \"",
    "tourRegion" : "Central Coast",
    "tourDifficulty" : "Medium",
    "tourLength" : 3,
    "tourPrice" : 750,
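A minimal sketch of consuming a record shaped like the entry above (the gist as shown is truncated before the closing brackets, so this embeds one complete record with a subset of the fields):

```python
import json

# One complete record with a few of the fields from the tours.json entry above.
record = json.loads("""
{
  "tourName": "Big Sur Retreat",
  "tourPackage": "Backpack Cal",
  "tourRegion": "Central Coast",
  "tourDifficulty": "Medium",
  "tourLength": 3,
  "tourPrice": 750
}
""")

# tourPrice is the total for the trip; tourLength is in days.
price_per_day = record["tourPrice"] / record["tourLength"]
print(record["tourName"], price_per_day)  # Big Sur Retreat 250.0
```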
from sparknlp.annotator import *
from sparknlp.base import *
from pyspark.ml import Pipeline

# Assemble raw images into the format Spark NLP annotators expect
imageAssembler = ImageAssembler() \
    .setInputCol("image") \
    .setOutputCol("image_assembler")

# Completion sketch: the original gist is truncated here; this follows the
# standard Spark NLP pattern of loading a pretrained annotator
imageClassifier = ViTForImageClassification \
    .pretrained() \
    .setInputCols(["image_assembler"]) \
    .setOutputCol("class")

from transformers import ViTFeatureExtractor, ViTForImageClassification
from transformers import pipeline
import torch

# Pick the GPU when one is available, otherwise fall back to CPU
device = "cuda:0" if torch.cuda.is_available() else "cpu"
print(device)

# Load the ViT feature extractor and classification head, then move the model to the device
feature_extractor = ViTFeatureExtractor.from_pretrained('google/vit-base-patch16-224')
model = ViTForImageClassification.from_pretrained('google/vit-base-patch16-224')
model = model.to(device)