
@jw3126
Created May 7, 2020 23:14
argmin |t*xs - ys|
using Test
using Convex1D
using BenchmarkTools
function minimize_scaled_L1_diff(xs, ys)
    # Find t::Number that minimizes f(t) = sum(abs, t*xs - ys).
    # f is convex with piecewise constant derivative
    #     f'(t) = Σ xi * sign(t*xi - yi)
    # so one of the points ti := yi/xi (for some xi != 0) must be a minimizer.
    # (If all xi == 0 then f is constant anyway.)
    # Based on remarks of Mathieu Tanneau in slack.
    x1 = one(eltype(xs))
    y1 = one(eltype(ys))
    T = typeof(y1 / x1)
    # We start searching at t = -Inf, where the derivative is
    # f'(-Inf) = -Σ |xi| <= 0, and look for the first ti where the
    # derivative becomes non-negative.
    df_neginf = sum(xs) do x
        x * sign(-x)
    end
    if df_neginf == 0
        @assert all(iszero, xs)
        return zero(T)
    else
        @assert df_neginf < 0
    end
    df = df_neginf
    ts = ys ./ xs
    crossings = sortperm(ts)
    for i in crossings
        xi = xs[i]
        # crossing ti flips sign(t*xi - yi), changing f' by 2|xi|
        df += 2xi * sign(xi)
        if df >= 0
            return ts[i]
        end
    end
    error("Unreachable")
end
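As a quick sanity check of the derivation above (not part of the original gist), the candidate points ti = yi/xi can be scanned directly: by convexity one of them must minimize f. The helper `brute_force_min` below is a hypothetical sketch for illustration, not the gist's method.

```julia
# Sketch: evaluate f at every candidate crossing ti = yi/xi
# (skipping xi == 0, where no crossing exists) and pick the best.
function brute_force_min(xs, ys)
    f(t) = sum(abs, t .* xs .- ys)
    ts = [y / x for (x, y) in zip(xs, ys) if x != 0]
    ts[argmin(f.(ts))]
end

# Tiny example: candidates are 0.3, -0.5, -1.4
xs = [1.0, -2.0, 0.5]
ys = [0.3, 1.0, -0.7]
t_bf = brute_force_min(xs, ys)
```

This is O(n²) (n candidates, each evaluation O(n)), whereas the sort-based sweep in `minimize_scaled_L1_diff` is O(n log n), so the scan is only useful as a cross-check on small inputs.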
N = 10^3
const xs = randn(N)
const ys = randn(N)
f = t -> sum(abs, t*xs - ys)
sol = @btime minimize(f, (-1000.0, 1000.0), atol=1e-15)
t_min = @btime minimize_scaled_L1_diff(xs, ys)
@test t_min ≈ sol.minimizer