To use live-html (webr/pyodide) in Quarto, add the quarto-live extension to the project:
quarto add r-wasm/quarto-live
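Once the extension is installed, a document opts in by switching its output format and writing executable code cells. Below is a minimal sketch, assuming quarto-live's live-html format and {webr} cells for R (pyodide cells work the same way for Python); the chunk contents are placeholders.

---
title: "Live code demo"
format: live-html
---

```{webr}
# runs in the reader's browser via webR
summary(cars)
```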
#' Workshop cost calculator
#'
#' This cost calculator gives the **minimum** cost for a workshop of the
#' requested size, number of participants, and hours.
#'
#' @param vcpu numeric() amount of virtual CPU requested (can be fractional)
#' @param memory numeric() the amount of requested memory, in GB
#' @param hours numeric() the number of hours for each instance to run.
#'   The workshop instances will run for this amount of time even if a
#'   participant is not using them, so every launch results in a cost.
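The function body itself is not shown here. The following is a minimal sketch of how such a calculator might look, assuming hypothetical per-vCPU-hour and per-GB-hour prices and a per-participant instance count; the real rates depend on the cloud provider and instance type.

workshop_cost <- function(vcpu, memory, hours, n_participants = 1,
                          vcpu_hour_price = 0.04, gb_hour_price = 0.005) {
  # hypothetical on-demand prices; substitute the provider's actual rates
  per_instance <- hours * (vcpu * vcpu_hour_price + memory * gb_hour_price)
  # one instance is launched per participant, and each launch is billed
  # for the full requested duration
  n_participants * per_instance
}

# example: 30 participants, 2 vCPU / 8 GB instances, 4 hours
workshop_cost(vcpu = 2, memory = 8, hours = 4, n_participants = 30)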
---
title: "Github Repos for Topic"
author: "Sean Davis"
format: html
params:
  gh_topic: "r01ca230551"
---
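The gh_topic parameter is presumably used later in the document to look up GitHub repositories tagged with that topic; that code is not shown here. A minimal sketch under that assumption, using the gh package (an addition for illustration, not part of the original):

library(gh)

# search GitHub for repositories carrying the topic given in the document parameters
repos <- gh("/search/repositories", q = paste0("topic:", params$gh_topic))
vapply(repos$items, function(x) x$full_name, character(1))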
## Required packages
# Install Bioconductor and required packages
# This command installs the BiocManager package for managing Bioconductor packages
install.packages("BiocManager")
# Use BiocManager to install specific packages for data analysis and visualization
BiocManager::install(c("GEOquery", "SummarizedExperiment", "ggplot2",
                       "party", "ggparty", "partykit", "randomForest"))
# Load necessary packages for modeling and visualization
library(party)  # For creating classification trees
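The remaining libraries are presumably loaded the same way further on. As a quick, self-contained check that the modeling packages work, here is a small example on a built-in dataset (not part of the original document):

library(partykit)
library(randomForest)

# conditional inference tree on the built-in iris data
tree_fit <- ctree(Species ~ ., data = iris)
plot(tree_fit)

# random forest on the same data, for comparison
rf_fit <- randomForest(Species ~ ., data = iris, ntree = 200)
print(rf_fit)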
Nextflow is a powerful workflow management system designed for creating scalable and reproducible scientific workflows. It enables you to write workflows in a declarative language, making it easy to define complex pipelines that can be executed on various platforms, including local machines, clusters, and cloud environments like Google Cloud.
This short tutorial is meant for informatics users who are comfortable with a command-line interface. It also assumes that the user is familiar with Nextflow and has run it on a local computer or HPC system.
Roughly, this document will walk through:
#!/bin/bash
# Fetch county-level US Cancer Statistics (USCS) data from the CDC API.
# The endpoint wraps its JSON payload in an XML <string> element, so sed
# strips that wrapper before jq flattens the records into JSON Lines.
curl https://gis.cdc.gov/Cancer/DataVizApi/GetJSON/USCS_County | sed -e 's/<string xmlns="http:\/\/schemas.microsoft.com\/2003\/10\/Serialization\/">//g' -e 's/<\/string>//g' | jq -c '.[] | .USCS_County[]' > output.jsonl
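The resulting output.jsonl has one JSON record per line. A minimal sketch of reading it back into R for analysis, assuming the jsonlite package (not mentioned in the original):

library(jsonlite)

# stream_in() reads newline-delimited JSON into a data frame
uscs <- stream_in(file("output.jsonl"))
str(uscs)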
Here is the converted Docker Compose YAML file, with a MySQL server as a separate container and a Docker volume for storage:
version: '3'
services:
  wandb-local:
    image: wandb/local
    container_name: wandb-local
    environment:
      - HOST=https://YOUR_DNS_NAME
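The listing above is truncated and shows only the wandb-local service. Below is a sketch of what the MySQL service and named volume described above might look like, with placeholder credentials; wiring wandb-local to this database is done through its own environment settings, which are not shown here.

  mysql:
    image: mysql:8
    container_name: wandb-mysql
    environment:
      - MYSQL_ROOT_PASSWORD=CHANGE_ME
      - MYSQL_DATABASE=wandb_local
    volumes:
      - wandb-mysql-data:/var/lib/mysql

volumes:
  wandb-mysql-data: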
You are an HR specialist and are evaluating the qualifications of job applicants
for a high-performance computing (HPC) specialist position.
You have been given a set of criteria to evaluate each candidate.
The candidate materials are in the attached PDF.
For each job applicant, fill in the following YAML-format criteria document. You
may use the "comment" field to provide additional context or justification for
your evaluation.

---
# candidate name
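The criteria document is truncated after the name comment. A hypothetical sketch of how the rest of such a YAML template might be structured, with made-up criterion names; only the "comment" field is taken from the prose above:

name: ""
criteria:
  - criterion: "HPC experience (e.g., schedulers, MPI, containers)"  # hypothetical criterion
    rating: 0     # e.g., on a 0-5 scale
    comment: ""
  - criterion: "Linux systems administration"                        # hypothetical criterion
    rating: 0
    comment: ""
overall_recommendation: ""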
# convert all CMGD SummarizedExperiments to CSV files
# Should run more-or-less directly as a script
# Requires more than 128GB RAM to complete
# Generates about 200GB of files
# BiocManager::install('curatedMetagenomicData')
# BiocManager::install(c('arrow', 'data.table', 'dplyr', 'readr'))
library(curatedMetagenomicData)
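The rest of the script is not shown. A minimal sketch of the conversion loop under the assumptions above, restricted to a single example resource pattern (used here only for illustration) so it can run without the full 128 GB / 200 GB footprint; the real script presumably iterates over every CMGD resource:

library(SummarizedExperiment)
library(data.table)

# fetch one resource as a list of (Tree)SummarizedExperiment objects
se_list <- curatedMetagenomicData("AsnicarF_2017.relative_abundance", dryrun = FALSE)

dir.create("cmgd_csv", showWarnings = FALSE)
for (nm in names(se_list)) {
  se <- se_list[[nm]]
  # write the assay matrix with feature names as the first column
  fwrite(as.data.frame(assay(se)),
         file.path("cmgd_csv", paste0(nm, "_assay.csv")), row.names = TRUE)
  # write the sample metadata alongside it
  fwrite(as.data.frame(colData(se)),
         file.path("cmgd_csv", paste0(nm, "_coldata.csv")), row.names = TRUE)
}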