// 🍍🍌πŸ₯πŸπŸŒπŸ₯πŸπŸŒπŸ₯πŸπŸŒπŸ₯πŸπŸŒπŸ₯πŸπŸŒπŸ₯
// time: O(N)
// space: O(1)
// why: making smoothie! 🍧
package main
import (
"fmt"
"math/rand"
)
type Node struct {
V string
Next *Node
}
func sampleUniform(l *Node) *Node {
if l == nil {
return nil
}
r := l
count := 0
for q := l; q != nil; q = q.Next {
count += 1
if rand.Float64() < 1/float64(count) {
r = q
}
}
return r
}
func main() {
l := &Node{V: "πŸ₯", Next: &Node{V: "🍍", Next: &Node{V: "🍌"}}}
count := make(map[string]int)
for range 1000 {
count[sampleUniform(l).V]++
}
fmt.Print(count)
// Output: map[🍌:336 🍍:340 πŸ₯:324]
}
nikolaydubina commented Jan 25, 2022

For truly uniform sampling, the probability of selecting each element is 1/N. But as N approaches infinity, the probability of picking any particular element approaches 0, so the algorithm's per-step acceptance probability has to approach 0 as well. Eventually 1/count underflows to zero as a float and the comparison never triggers; in other words, over an infinite sequence you would never select any value and would just keep iterating.

This is different from a finite but undetermined N: there the probability is nonzero but unknown in advance, since it depends on the number of elements. We have to traverse the whole sequence anyway, decreasing the probability of replacing the current pick as we go.
