process dataset
python process_dataset.py
run training
JSX is a syntax extension for JavaScript that lets you write HTML-like markup inside a JavaScript file. Although there are other ways to write components, most React developers prefer the conciseness of JSX, and most codebases use it.

You will learn

- Why React mixes markup with rendering logic
- How JSX is different from HTML
- How to display information with JSX

JSX: Putting markup into JavaScript

The Web has been built on HTML, CSS, and JavaScript. For many years, web developers kept content in HTML, design in CSS, and logic in JavaScript, often in separate files! Content was marked up inside HTML while the page’s logic lived separately in JavaScript:

[Diagram: HTML markup with purple background and a div with two child tags: p and form]
system = [
    "",
    "You are an AI assistant. Provide a detailed answer so user don’t need to search outside to understand the answer.",
    "You are an AI assistant. You will be given a task. You must generate a detailed and long answer.",
    "You are a helpful assistant, who always provide explanation. Think like you are answering to a five year old.",
    "You are an AI assistant that follows instruction extremely well. Help as much as you can.",
    "You are an AI assistant that helps people find information. Provide a detailed answer so user don’t need to search outside to understand the answer.",
    "You are an AI assistant. User will you give you a task. Your goal is to complete the task as faithfully as you can. While performing the task think step-by-step and justify your steps.",
    "You should describe the task and explain your answer. While answering a multiple choice question, first output the correct answer(s). Then explain why other answers are wrong. Think like you are answering to a five year old.",
]
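One way such a list is typically used is to attach a randomly chosen system prompt to each question/answer pair when building chat-style training examples. A minimal sketch, assuming that goal (the build_example helper and the message format are illustrative, not from the source):

import random

def build_example(question, answer, system_prompts=system):
    # Illustrative helper: pair one randomly drawn system prompt
    # with a question/answer pair in a chat-style message format.
    return {
        "messages": [
            {"role": "system", "content": random.choice(system_prompts)},
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ]
    }

# Hypothetical usage with a single Q/A pair
example = build_example("What was the revenue in 2022?", "Company A's total revenue in 2022 was $500M.")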
Use the following pieces of context below to answer the question at the end. If you don’t know the answer, just say that you don’t know, don’t try to make up an answer.

Dear Shareholders, We are pleased to present the financial report for Company A for the fiscal year ended December 31, 2022. This report provides an overview of our financial performance, key highlights, and future prospects. We encourage you to review this report and gain insights into our company’s growth and financial standing. In 2022, Company A achieved a total revenue of $500M, representing a 5% increase compared to the previous year. This growth was primarily driven by increased sales across our product lines and expanded market presence.

Question: What was the revenue in 2022?
Helpful answer:
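A minimal sketch of how a template like this could be filled programmatically before being sent to a model; the RAG_TEMPLATE name and the str.format placeholders are assumptions for illustration, not from the source:

RAG_TEMPLATE = (
    "Use the following pieces of context below to answer the question at the end. "
    "If you don't know the answer, just say that you don't know, "
    "don't try to make up an answer.\n\n"
    "{context}\n\n"
    "Question: {question}\n"
    "Helpful answer:"
)

# Hypothetical usage: substitute retrieved context and a user question
prompt = RAG_TEMPLATE.format(
    context="Dear Shareholders, ... In 2022, Company A achieved a total revenue of $500M ...",
    question="What was the revenue in 2022?",
)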
data_pairs = [
    {
        "question": [
            "Where did you come from?",
            "What is your place of origin?",
            "From where did you originate?",
            "Can you disclose your point of origin?",
            "Where were you created?",
            "What is the source of your existence?",
            "What is the birthplace of your being?",
{
    "code": "import matplotlib.pyplot as plt\r\nimport numpy as np\r\nfrom sagemaker import image_uris, model_uris, script_uris, instance_types\r\nfrom sagemaker.predictor import Predictor\r\nfrom sagemaker import get_execution_role\r\nimport json\r\n\r\n\r\nmodel_id, model_version = \"model-txt2img-stabilityai-stable-diffusion-v2-fp16\", \"*\"\r\n\r\ninference_instance_type = instance_types.retrieve_default(\r\n model_id=model_id, model_version=model_version, scope=\"inference\"\r\n)\r\n\r\n# Retrieve the inference docker container uri. This is the base HuggingFace container image for the default model above.\r\ndeploy_image_uri = image_uris.retrieve(\r\n region=None,\r\n framework=None, # automatically inferred from model_id\r\n image_scope=\"inference\",\r\n model_id=model_id,\r\n model_version=model_version,\r\n instance_type=inference_instance_type,\r\n)\r\n\r\n# Retrieve the inference script uri. This includes all dependencies and scripts for model loading, inference handling etc
import gradio as gr

# Monochrome base theme with custom hues, corner radius, and fonts
theme = gr.themes.Monochrome(
    primary_hue="indigo",
    secondary_hue="blue",
    neutral_hue="slate",
    radius_size="radius_sm",
    font=[gr.themes.GoogleFont('Open Sans'), 'ui-sans-serif', 'system-ui', 'sans-serif'],
).set(
    shadow_drop='*button_shadow',
)
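A minimal sketch of how the theme might then be attached to a demo; the Blocks layout below is illustrative, not from the source:

with gr.Blocks(theme=theme) as demo:
    gr.Markdown("Demo using the custom Monochrome theme")
    inp = gr.Textbox(label="Input")
    out = gr.Textbox(label="Output")
    # Echo the input in upper case, just to have an interactive element
    inp.submit(lambda text: text.upper(), inputs=inp, outputs=out)

demo.launch()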
from transformers import AutoModelForSeq2SeqLM
from peft import PeftModel

# load the base model
base_model_id = "google/flan-t5-xxl"
model = AutoModelForSeq2SeqLM.from_pretrained(base_model_id)

# wrap the base model with saved PEFT adapter weights;
# PeftModel.from_pretrained requires the adapter's path or hub id
# (the value below is a placeholder)
peft_model_id = "path/to/peft-adapter"
model = PeftModel.from_pretrained(model, peft_model_id)
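A short usage sketch, assuming the adapter weights have been loaded as above (the prompt text is arbitrary):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
inputs = tokenizer("Summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))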
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "metadata": {
    "_generator": {
      "name": "bicep",
      "version": "0.12.40.16777",
      "templateHash": "4423847801202994493"
    }
  },