The one using a proper PRNG is more typical. The other one is closer to what the paper intended, I think: literally run a side-effect on every call to choice (lazy I/O), which is what this does (modulo buffered reads).
```haskell
-- needs to be a `M i o a', but almost there. LH is difficult (bad docs, bad errors, changing syntax)
{-# LANGUAGE GADTs #-}
{-# OPTIONS_GHC -fplugin=LiquidHaskell #-}
{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveFunctor #-}
{-@ LIQUID "--prune-unsorted" @-}
import Control.Monad.State
import Data.Set qualified as Set
```
```haskell
{-# OPTIONS_GHC -fplugin=LiquidHaskell #-}
{-# LANGUAGE OverloadedStrings, BangPatterns #-}
import Data.Word
import Foreign.Ptr
import Foreign.C.Types
import Foreign.Marshal.Alloc
import System.Posix.IO.ByteString
import System.Posix.Types

main :: IO ()
```
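The snippet cuts off before the interesting part, so here's a rough sketch (mine, not the post's code) of how those imports could be used to make a side-effect happen per choice. `choiceFrom` and the one-byte-per-call read are my own inventions, it assumes the three-argument `openFd` from unix >= 2.8, and the real version presumably reads a whole buffer at a time (hence "modulo buffered reads").

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Control.Monad (replicateM)
import Data.Word (Word8)
import Foreign.Marshal.Alloc (allocaBytes)
import Foreign.Storable (peekByteOff)
import System.Posix.IO.ByteString
  (OpenMode (ReadOnly), defaultFileFlags, fdReadBuf, openFd)
import System.Posix.Types (Fd)

-- Hypothetical: one read syscall per choice, low bit picks the branch.
choiceFrom :: Fd -> a -> a -> IO a
choiceFrom fd x y =
  allocaBytes 1 $ \buf -> do
    _ <- fdReadBuf fd buf 1 -- ignoring short reads for the sketch
    b <- peekByteOff buf 0 :: IO Word8
    pure (if even b then x else y)

main :: IO ()
main = do
  fd <- openFd "/dev/urandom" ReadOnly defaultFileFlags -- unix >= 2.8 signature
  picks <- replicateM 20 (choiceFrom fd 'H' 'T')
  putStrLn picks
```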
```haskell
data Todo = Todo
  { id :: Text
  , created :: UTCTime
  , title :: Text
  , description :: Text
  , priority :: Int
  }

data Command
  = Add Main.Todo
```
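For reference (not from the post): the record fields above want Text from Data.Text and UTCTime from Data.Time, and a throwaway construction of an Add command might look like this, with all the field values obviously made up.

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Text (Text)
import Data.Time (UTCTime, getCurrentTime)

-- Made-up example values, just to show the shape of the record.
mkAdd :: IO Command
mkAdd = do
  now <- getCurrentTime
  pure
    (Add
       Todo
         { id = "todo-1"
         , created = now
         , title = "Read this week's pearl"
         , description = "Posted by the Friday workflow"
         , priority = 1
         })
```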
```yaml
name: Create Functional Pearl discussion weekly
on:
  schedule:
    - cron: "0 12 * * 5" # 5 = Friday (12:00 UTC)
  workflow_dispatch:
jobs:
  create-discussion:
    runs-on: ubuntu-latest
```
```haskell
data Opts = Opts
  { startDate :: Text
  , step :: Text
  }

options =
  (\startDate step -> Main.Opts {startDate, step})
    <$> Options.strOption
          (Options.long "start-date" <> Options.help "Starting date in 2025-01-24 format.")
    <*> Options.strOption
          (Options.long "step" <> Options.help "How many days to add each line.")
```
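To round it out (my sketch, assuming Options is Options.Applicative imported qualified, with a made-up program description), the parser would get run with something like:

```haskell
import Options.Applicative qualified as Options

main :: IO ()
main = do
  opts <-
    Options.execParser
      (Options.info
         (Options.helper <*> options)
         (Options.fullDesc <> Options.progDesc "Print one date per line, stepping forward."))
  print (startDate opts, step opts)
```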
```javascript
// Create the context menu item when the extension is installed
browser.contextMenus.create({
  id: "open-link-temporarily",
  title: "Open Link temporarily",
  contexts: ["link"]
});

// Track tabs that should be auto-closed
const tempTabs = new Map();
```
```haskell
{-# LANGUAGE RankNTypes, QuantifiedConstraints, StandaloneDeriving, DeriveFunctor #-}

newtype T1 t a = T1 { unT1 :: forall e. t e a }

deriving instance (forall e. Functor (t e)) => Functor (T1 t)

instance (forall e. Applicative (Either e)) => Applicative (T1 Either) where
  pure x = T1 (pure x)
  (<*>) f x = T1 (unT1 f <*> unT1 x)

instance (forall e. Monad (Either e)) => Monad (T1 Either) where
  (>>=) m f = T1 (unT1 m >>= unT1 . f)
```
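As a sanity check (my example, not from the post): because the error type stays universally quantified, a T1 Either computation can never actually build a Left, so you can unwrap it at whatever error type you like afterwards.

```haskell
example :: T1 Either Int
example = do
  x <- pure 2
  y <- pure 3
  pure (x * y)

-- Choose the error type only at the point of unwrapping.
runExample :: Either String Int
runExample = unT1 example -- Right 6
```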
```haskell
import Data.Constraint (Dict (..))
import Data.Dynamic (Dynamic, toDyn)

newtype L c t = L (forall e. Dict (c (t e)))

withL :: forall c t r. L c t -> (forall e. Dict (c (t e)) -> r) -> r
withL (L dict) r = r dict

functorEitherL :: L Functor Either
functorEitherL = L Dict

someFunctorEitherL :: Dynamic
someFunctorEitherL = toDyn functorEitherL
```
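And the round trip back out of Dynamic (again my code, not the post's): fromDynamic recovers the wrapped dictionary, and withL brings the Functor evidence back into scope.

```haskell
import Data.Dynamic (fromDynamic)

recovered :: Maybe (L Functor Either)
recovered = fromDynamic someFunctorEitherL

-- True if the dictionary survived the trip through Dynamic.
gotEvidence :: Bool
gotEvidence =
  case recovered of
    Just l -> withL l (\Dict -> True)
    Nothing -> False
```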
Llama-3.2-3B-Instruct-uncensored.Q4_K_M

summarise: https://chrisdone.com/posts/llms/

> llama-summarise-buffer:

The document is a personal reflection by the author, a researcher, on their experiences with large language models (LLMs) from September 23, 2025, discussing their applications, limitations, and concerns regarding their impact.

> summarise whole thing: