See how a minor change to your commit message style can make you a better programmer.
Format:

```
<type>(<scope>): <subject>
```

`<scope>` is optional.
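Under that convention, a commit log might contain entries like these (hypothetical examples, with `feat`, `fix`, and `docs` as the commit types):

```
feat(auth): add OAuth2 login flow
fix: handle empty payloads in the webhook parser
docs(readme): clarify local setup steps
```

The `type` tells readers at a glance whether a commit adds behavior, fixes behavior, or touches neither.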
```json
{
  "firstNames": [
    "Crimson",
    "Cinnamon",
    "Silent",
    "Sheik",
    "Dreamy",
    "Wondering",
    "Spicy",
    "Catchy",
```
```rust
use std::collections::HashMap;
use std::fmt::Debug;

pub trait StaticBuilder {
    // `dyn` makes the trait-object return type explicit
    fn build(&self) -> Box<dyn HasName>;
}

struct UserBuilder;

impl StaticBuilder for UserBuilder {
```
The problem with large language models is that you can't run them locally on your laptop. Thanks to Georgi Gerganov and his llama.cpp project, it is now possible to run Meta's LLaMA on a single computer without a dedicated GPU.
There are multiple steps involved in running LLaMA locally on an M1 Mac after downloading the model weights.
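The steps look roughly like the following, a sketch based on the llama.cpp README as it stood at the time; the model paths are assumptions, and the project's script names and flags have changed in later versions:

```shell
# Clone and build llama.cpp -- plain `make` targets the CPU, so no GPU is needed
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Convert the downloaded LLaMA weights to the ggml format
# (assumes the 7B weights were placed under models/7B/)
python3 convert-pth-to-ggml.py models/7B/ 1

# Quantize to 4 bits so the model fits in laptop RAM
./quantize ./models/7B/ggml-model-f16.bin ./models/7B/ggml-model-q4_0.bin 2

# Run inference on the quantized model
./main -m ./models/7B/ggml-model-q4_0.bin -p "Building a website can be done in" -n 128
```

The 4-bit quantization step is what makes this practical: it shrinks the 7B model from roughly 13 GB to around 4 GB, small enough for a laptop's memory.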
```ruby
# spec/rails_helper.rb
if ENV['SAVE_SCREENSHOTS']
  module CapybaraElementExtensions
    INTERACTION_METHODS = %i[set select_option unselect_option click
      right_click double_click send_keys hover trigger drag_to execute_script
      evaluate_script evaluate_async_script]

    INTERACTION_METHODS.each do |method|
      define_method method do |*args, &block|
```