durul dalkanat (durul), GitHub gists
@MainActor
func placeInRealityView(modelURL: URL, into content: RealityViewContent, under parent: Entity?) async throws {
    let entity = try await ModelEntity(contentsOf: modelURL)

    // Configure the entity for direct manipulation; this installs a
    // ManipulationComponent, so setting a fresh default one afterward
    // would discard that configuration.
    ManipulationComponent.configureEntity(entity, collisionShapes: nil)

    if let parent {
        // Attach beneath the caller-supplied parent.
        parent.addChild(entity)
    } else {
        // Otherwise anchor the model at the world origin.
        let anchor = AnchorEntity(world: .zero)
        content.add(anchor)
        anchor.addChild(entity)
    }
}
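A hedged usage sketch for the helper above, calling it from a `RealityView` make closure (the view name and the bundled `.usdz` resource name are assumptions, not part of the gist):

```swift
import SwiftUI
import RealityKit

// Hypothetical host view: loads a bundled model and places it via the helper.
struct ModelPlacementView: View {
    var body: some View {
        RealityView { content in
            // Placeholder resource name; substitute a real .usdz in the app bundle.
            if let url = Bundle.main.url(forResource: "Model", withExtension: "usdz") {
                try? await placeInRealityView(modelURL: url, into: content, under: nil)
            }
        }
    }
}
```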

🖊️ Logitech Muse Stylus Integration

Complete visionOS implementation for tracking Logitech Muse stylus with real-time position updates, button input, and haptic feedback.

🚀 Quick Start

1. Prerequisites

  • visionOS 26+
  • Logitech Muse stylus
@durul
durul / gist:5af30ac5a2379c0e569709558b1911a6
Last active October 13, 2025 13:53
SwiftUI: Segmented “Pill” Control (Location / Settings)

✅ SwiftUI: Segmented “Pill” Control (Location / Settings)

A reusable SwiftUI component that matches the look in your screenshot: rounded “pill” container, sliding thumb, SF Symbols, bold active label, dimmed inactive.

import SwiftUI

// MARK: - Model
enum SegTab: String, CaseIterable, Identifiable {
    case location = "Location"
    case settings = "Settings"

    var id: String { rawValue }
}
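A minimal sketch of the pill container described above. The sliding thumb via `matchedGeometryEffect`, the specific paddings, and the assumption that `SegTab` also declares a `settings` case are illustrative, not the gist's exact implementation:

```swift
import SwiftUI

// Hypothetical sketch: two-tab pill with a sliding thumb,
// bold active label, dimmed inactive label.
struct SegmentedPill: View {
    @Binding var selection: SegTab
    @Namespace private var thumb

    var body: some View {
        HStack(spacing: 0) {
            ForEach(SegTab.allCases) { tab in
                Button {
                    withAnimation(.spring(duration: 0.25)) { selection = tab }
                } label: {
                    Text(tab.rawValue)
                        .fontWeight(selection == tab ? .bold : .regular)
                        .opacity(selection == tab ? 1 : 0.5)
                        .padding(.vertical, 10)
                        .frame(maxWidth: .infinity)
                }
                .buttonStyle(.plain)
                .background {
                    if selection == tab {
                        // The thumb slides between tabs via matched geometry.
                        Capsule()
                            .fill(.white.opacity(0.2))
                            .matchedGeometryEffect(id: "thumb", in: thumb)
                    }
                }
            }
        }
        .padding(4)
        .background(Capsule().fill(.black.opacity(0.15)))
    }
}
```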
@durul
durul / HDMI on Apple Vision Pro.md
Created August 30, 2025 16:32 — forked from KhaosT/HDMI on Apple Vision Pro.md
Guide for using Apple Vision Pro as HDMI display

Displaying HDMI sources on Apple Vision Pro

While it's possible to stream most content to Apple Vision Pro directly over the internet, having the ability to use Apple Vision Pro as an HDMI display can still be useful.

Since Apple Vision Pro does not support connecting to an HDMI input directly or using an HDMI capture card, we have to be a little creative to make this work. NDI provides the ability to stream HDMI content over a local network with really low latency, and it works great with Apple Vision Pro.

This page shows the setup I’m using.

@durul
durul / CHATGPT VERSION (GPT-4 | GPT-4.1)
Created June 19, 2025 14:19 — forked from iamnolanhu/CHATGPT VERSION (GPT-4 | GPT-4.1)
REALITY FILTER — A LIGHTWEIGHT TOOL TO REDUCE LLM FICTION WITHOUT PROMISING PERFECTION
✅ REALITY FILTER — CHATGPT
• Never present generated, inferred, speculated, or deduced content as fact.
• If you cannot verify something directly, say:
- “I cannot verify this.”
- “I do not have access to that information.”
- “My knowledge base does not contain that.”
• Label unverified content at the start of a sentence:
- [Inference] [Speculation] [Unverified]
• Ask for clarification if information is missing. Do not guess or fill gaps.
import SwiftUI

struct HarmonicButton: View {
    var body: some View {
        Button(
            action: {},
            label: {} // label left empty as in the original gist
        )
        .frame(width: 240.0, height: 70.0)
        .buttonStyle(HarmonicStyle()) // custom ButtonStyle, not shown in this preview
    }
}
import SwiftUI
import Combine

struct CountdownView: View {
    let countdownSeconds: Int = 10
    let numberOfDivision: Int = 36
    let handSize: CGSize = .init(width: 8, height: 24)
    let radius: CGFloat = 100
    @State var count: Int = 10

    var body: some View {
        // Placeholder body; the full countdown drawing is truncated in this preview.
        Text("\(count)")
    }
}
//
// DotGridView.swift
//
import SwiftUI

struct DotPosition: Equatable, Hashable {
    let row: Int
    let column: Int
}
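A hypothetical sketch of how a `DotGridView` might consume `DotPosition`; the grid dimensions, dot sizing, and tap-to-select behavior are assumptions about the truncated gist:

```swift
import SwiftUI

// Hypothetical: a tappable grid of dots keyed by DotPosition.
struct DotGridView: View {
    let rows = 5
    let columns = 5
    @State private var selected = Set<DotPosition>()

    var body: some View {
        Grid {
            ForEach(0..<rows, id: \.self) { row in
                GridRow {
                    ForEach(0..<columns, id: \.self) { column in
                        let pos = DotPosition(row: row, column: column)
                        Circle()
                            .fill(selected.contains(pos) ? Color.blue : Color.gray)
                            .frame(width: 12, height: 12)
                            .onTapGesture {
                                // Hashable conformance lets DotPosition key a Set.
                                if !selected.insert(pos).inserted {
                                    selected.remove(pos)
                                }
                            }
                    }
                }
            }
        }
    }
}
```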
@durul
durul / gist:3e9034f5a75c4209b19e9b5554fdeedd
Created January 1, 2025 16:49
Dynamic Sphere Animation in SwiftUI with TimelineView and Canvas
import SwiftUI

struct SphereView: View {
    @State var start = Date()

    // Computes one animated point on the sphere.
    func createPoint(at angle: Double, radius: Double, time: Double, center: CGPoint, pointSize: Double) -> (path: Path, color: Color) {
        let wobble = sin(time * 2 + radius / 10) * 10
        let distanceModifier = 1 + sin(angle * 3 + time) * 0.1
        let adjustedRadius = (radius + wobble) * distanceModifier
        // Illustrative completion; the gist preview truncates here.
        let origin = CGPoint(x: center.x + cos(angle) * adjustedRadius - pointSize / 2,
                             y: center.y + sin(angle) * adjustedRadius - pointSize / 2)
        let path = Path(ellipseIn: CGRect(origin: origin, size: CGSize(width: pointSize, height: pointSize)))
        return (path, Color(hue: angle / (2 * .pi), saturation: 0.8, brightness: 0.9))
    }

    var body: some View {
        // Drawing body truncated in this preview.
        EmptyView()
    }
}
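The gist's title mentions TimelineView and Canvas; this is a hedged sketch of how `createPoint` might be driven by them. The wrapper view name, ring radii, and point counts are assumptions:

```swift
import SwiftUI

// Hypothetical wrapper animating points from SphereView.createPoint
// with TimelineView + Canvas.
struct SphereCanvasDemo: View {
    private let sphere = SphereView()
    private let start = Date()

    var body: some View {
        TimelineView(.animation) { timeline in
            Canvas { context, size in
                let time = timeline.date.timeIntervalSince(start)
                let center = CGPoint(x: size.width / 2, y: size.height / 2)
                // Draw several concentric rings of wobbling points.
                for ring in stride(from: 20.0, through: 100.0, by: 20.0) {
                    for step in 0..<36 {
                        let angle = Double(step) / 36.0 * 2 * .pi
                        let point = sphere.createPoint(at: angle, radius: ring, time: time,
                                                       center: center, pointSize: 4)
                        context.fill(point.path, with: .color(point.color))
                    }
                }
            }
        }
    }
}
```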