@vipmax
vipmax / zed_tasks.md
Created October 15, 2024 12:18
zed tasks cwd example

Zed Tasks Configuration

This configuration allows Zed to run tasks with specific working directories for multiple projects.

Tasks Configuration

[
  {
    "label": "backend",
@vipmax
vipmax / docker-compose.yml
Last active October 8, 2024 19:53
minimal docker-compose apache kafka broker and kafka-ui
version: '3.8'
networks:
  my_network:
    driver: bridge
volumes:
  kafka-data:
  kafka-ui-data:
services:
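The services themselves are cut off in the preview. A minimal sketch of how the two services could be filled in, assuming the `bitnami/kafka` (KRaft mode) and `provectuslabs/kafka-ui` images and their documented environment variables (image tags and env values are assumptions, not from the gist):

```yaml
  kafka:
    image: bitnami/kafka:latest
    networks: [my_network]
    ports:
      - "9092:9092"
    volumes:
      - kafka-data:/bitnami/kafka
    environment:
      KAFKA_CFG_NODE_ID: 0
      KAFKA_CFG_PROCESS_ROLES: controller,broker
      KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: 0@kafka:9093
      KAFKA_CFG_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093
      KAFKA_CFG_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_CFG_CONTROLLER_LISTENER_NAMES: CONTROLLER
  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    networks: [my_network]
    ports:
      - "8080:8080"
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:9092
```

With this, `docker compose up` gives a single broker on `localhost:9092` and the UI on `localhost:8080`.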
@dhbrojas
dhbrojas / document.rs
Created December 8, 2023 04:06
Under-Tested Implementation of Applying LSP Content Changes to Ropey and Tree Sitter
use ropey::{Rope, RopeSlice};
use serde::{Deserialize, Serialize};
use std::fmt;
use thiserror::Error;
use tower_lsp::lsp_types::{Position, TextDocumentContentChangeEvent};
use tree_sitter::{InputEdit, Parser, Point, Tree};
pub struct TextDocument {
    pub rope: Rope,
    pub tree: Option<Tree>,
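The core difficulty in applying an LSP `TextDocumentContentChangeEvent` is that LSP positions are (line, UTF-16 column) pairs while the buffer is UTF-8 bytes. A self-contained sketch of that conversion, independent of `ropey` (the function name is mine, not from the gist); once you have byte offsets, the edit reduces to a byte-range splice plus a matching tree-sitter `InputEdit`:

```rust
// Convert an LSP Position (zero-based line, UTF-16 column) into a byte
// offset in a UTF-8 buffer. Returns None if the line does not exist.
fn position_to_byte(text: &str, line: u32, character: u32) -> Option<usize> {
    // Advance to the byte offset where the requested line starts.
    let mut offset = 0;
    for _ in 0..line {
        offset += text[offset..].find('\n')? + 1;
    }
    // Walk the line, counting UTF-16 code units until `character` is reached.
    let mut utf16 = 0;
    for (i, ch) in text[offset..].char_indices() {
        if utf16 >= character as usize || ch == '\n' {
            return Some(offset + i);
        }
        utf16 += ch.len_utf16();
    }
    Some(text.len())
}
```

Clamping past-the-end columns to the line end (rather than erroring) matches how most servers tolerate slightly stale positions.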
@vipmax
vipmax / readme.md
Created October 13, 2023 11:38
websocket send with response

websocket send with response

By default, a WebSocket only lets you send messages; there is no built-in way to wait for a response to a particular message.

0. Initialization

let ws = new WebSocket('ws://localhost:3000/ws')

ws.onopen = function(event) {
  console.log("on open")
}
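One common way to get request/response semantics on top of this is to tag each outgoing message with an id and resolve a stored Promise when the reply carrying that id arrives. A sketch of the pattern (the class and field names are mine, not necessarily the gist's; `sendRaw` stands in for `ws.send`):

```javascript
// Correlate WebSocket requests with responses via message ids.
class RpcSocket {
  constructor(sendRaw) {
    this.sendRaw = sendRaw;   // e.g. (m) => ws.send(m)
    this.nextId = 1;
    this.pending = new Map(); // id -> resolve function
  }
  // Send a payload and get a Promise for the matching response.
  send(payload) {
    const id = this.nextId++;
    return new Promise((resolve) => {
      this.pending.set(id, resolve);
      this.sendRaw(JSON.stringify({ id, payload }));
    });
  }
  // Wire this to ws.onmessage; resolves the Promise for the echoed id.
  onMessage(data) {
    const { id, payload } = JSON.parse(data);
    const resolve = this.pending.get(id);
    if (resolve) {
      this.pending.delete(id);
      resolve(payload);
    }
  }
}
```

The server must echo the `id` back in its reply; production code would also want a timeout that rejects and removes stale entries from `pending`.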
@vipmax
vipmax / scala 3 and apache spark.md
Created November 10, 2022 10:38
scala 3 and apache spark

scala 3 and apache spark

1. Install the Scala toolchain. The Scala installer is a tool named Coursier.

mac os

brew install coursier/formulas/coursier && cs setup

linux
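Beyond installing the toolchain, the other piece of using Scala 3 with Spark is the build: Spark is published only for Scala 2.13, so a Scala 3 project pulls it in with sbt's `CrossVersion.for3Use2_13`. A `build.sbt` sketch (the Scala and Spark versions are assumptions):

```scala
// build.sbt — sketch; versions are assumptions, not from the gist.
scalaVersion := "3.3.1"

libraryDependencies ++= Seq(
  // Reuse the Scala 2.13 Spark artifacts from a Scala 3 build.
  ("org.apache.spark" %% "spark-core" % "3.5.0").cross(CrossVersion.for3Use2_13),
  ("org.apache.spark" %% "spark-sql"  % "3.5.0").cross(CrossVersion.for3Use2_13)
)
```

This works because Scala 3 and 2.13 are binary-compatible at the library level; the caveat is that no other dependency may pull in the same library compiled for Scala 3.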

@pkozelka
pkozelka / both_mixed_async.rs
Last active February 7, 2024 17:49
Process execution in Rust
use std::path::Path;
use std::process::Stdio;
use tokio::io::BufReader;
use tokio::io::AsyncBufReadExt;
use tokio::process::Command;
use tokio::sync::mpsc;
/// Execute a process, gather its mixed outputs into stdout
///
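The gist does this with tokio; as a point of comparison, the same idea — interleave a child process's stdout and stderr lines into one stream — can be sketched with only the standard library, using two reader threads and an mpsc channel (function and variable names here are mine):

```rust
use std::io::{BufRead, BufReader};
use std::process::{Command, Stdio};
use std::sync::mpsc;
use std::thread;

/// Run a command and collect its stdout and stderr lines, mixed,
/// in the order they were read.
fn run_mixed(cmd: &str, args: &[&str]) -> Vec<String> {
    let mut child = Command::new(cmd)
        .args(args)
        .stdout(Stdio::piped())
        .stderr(Stdio::piped())
        .spawn()
        .expect("failed to spawn process");

    let (tx, rx) = mpsc::channel();
    let out = child.stdout.take().unwrap();
    let err = child.stderr.take().unwrap();
    let tx2 = tx.clone();

    // One thread per pipe; reading both on one thread can deadlock
    // if the child fills the unread pipe's buffer.
    let h1 = thread::spawn(move || {
        for line in BufReader::new(out).lines().flatten() {
            let _ = tx.send(line);
        }
    });
    let h2 = thread::spawn(move || {
        for line in BufReader::new(err).lines().flatten() {
            let _ = tx2.send(line);
        }
    });

    h1.join().unwrap();
    h2.join().unwrap();
    child.wait().unwrap();
    // Both senders are dropped, so the receiver iterator terminates.
    rx.into_iter().collect()
}
```

The async version in the gist buys the same non-blocking reads without dedicating two OS threads per child process.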
@vipmax
vipmax / kafka concurrent batch commit
Last active September 20, 2018 15:55
kafka concurrent batch commit
val kafkaProps = new Properties()
kafkaProps.put("bootstrap.servers", endpoint)
kafkaProps.put("key.serializer", classOf[ByteArraySerializer])
kafkaProps.put("key.deserializer", classOf[ByteArrayDeserializer])
kafkaProps.put("value.serializer", classOf[ByteArraySerializer])
kafkaProps.put("value.deserializer", classOf[ByteArrayDeserializer])
kafkaProps.put("group.id", "CrawlerTasksStorage")
// Pull up to 1000 records per poll so offsets can be committed in batches.
kafkaProps.put("max.poll.records", "1000")
// Disable auto-commit: offsets are committed manually after each batch.
kafkaProps.put("enable.auto.commit", "false")

Kafka installation with systemd

0. Create kafka user

sudo adduser kafka
sudo adduser kafka sudo
su -l kafka

1. Download and install the Kafka archive
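The systemd piece of the setup is a unit file that starts the broker as the `kafka` user. A typical sketch, assuming the archive was unpacked to `/home/kafka/kafka` (paths and the exact unit options are assumptions):

```ini
# /etc/systemd/system/kafka.service — sketch; adjust paths to your install.
[Unit]
Description=Apache Kafka broker
After=network.target

[Service]
Type=simple
User=kafka
ExecStart=/home/kafka/kafka/bin/kafka-server-start.sh /home/kafka/kafka/config/server.properties
ExecStop=/home/kafka/kafka/bin/kafka-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target
```

Then `sudo systemctl daemon-reload && sudo systemctl enable --now kafka` starts the broker and keeps it running across reboots.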

@razor-x
razor-x / server.py
Last active June 18, 2024 14:27
Python http.server that sets Access-Control-Allow-Origin header.
# Python http.server that sets Access-Control-Allow-Origin header.
# https://gist.github.com/razor-x/9542707
import os
import sys
import http.server
import socketserver
PORT = 8000
@hanbzu
hanbzu / stream_concat.scala
Created November 6, 2013 14:18
Scala: Stream concatenation. Thanks to Pavel Lepin.
val a = Stream(1)
//a: scala.collection.immutable.Stream[Int] = Stream(1, ?)
def b: Stream[Int] = Stream(b.head)
//b: Stream[Int]
a #::: b
//res0: scala.collection.immutable.Stream[Int] = Stream(1, ?)
a append b