
@rxwei
rxwei / swift-charts-framework-internship.md
Last active December 6, 2023 16:58
Swift Charts Framework Internship at Apple


Note: Applications are now closed.

Apple's Swift Charts team is now looking for interns for 2024!

This is a paid, in-person internship in Cupertino, California. While most internships last 3 months, starting in May or June, the start date and internship length are flexible. Internships are restricted to students pursuing a Bachelor's, Master's, or PhD degree. Students must be enrolled in school in

@rxwei
rxwei / main.swift
Last active January 21, 2020 06:42
Incremental computation with differentiable programming https://arxiv.org/pdf/1312.0658.pdf
// To be shared on Swift Forums.
// Compile with:
// swiftc -Xllvm -enable-experimental-cross-file-derivative-registration -enable-experimental-forward-mode-differentiation main.swift
// MARK: - Make integers differentiable
extension Int: Differentiable {
  public typealias TangentVector = Int
}
@rxwei
rxwei / zeroTangentVectorInitializer.swift
Last active October 7, 2019 22:41
Zero tangent vector initializer
import TensorFlow
protocol Differentiable {
  ...
  var zeroTangentVectorInitializer: () -> Self { get }
}
extension Tensor where Scalar: TensorFlowFloatingPoint {
  var zeroTangentVectorInitializer: () -> Self {
    { [shape = self.shape] in Tensor(zeros: shape) }
  }
}
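The capture list above (`[shape = self.shape]`) exists so the closure retains only the tensor's shape, not the tensor itself. A self-contained sketch of the same pattern, using a hypothetical `Vector` type in place of `Tensor`:

```swift
// Hypothetical `Vector` type illustrating the capture-list pattern above.
struct Vector {
    var values: [Float]

    // Capturing only `count` keeps the closure from retaining the
    // (potentially large) `values` storage via `self`.
    var zeroTangentVectorInitializer: () -> Vector {
        { [count = self.values.count] in
            Vector(values: Array(repeating: 0, count: count))
        }
    }
}

let v = Vector(values: [1, 2, 3])
let makeZero = v.zeroTangentVectorInitializer
print(makeZero().values)  // [0.0, 0.0, 0.0]
```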
@rxwei
rxwei / upstreaming-swift-autodiff.md
Last active January 28, 2020 11:06
Upstreaming Swift AutoDiff


Author: Richard Wei ([email protected]) on behalf of the Swift for TensorFlow team

Last updated: October 2, 2019

Overview

The differentiable programming feature (AutoDiff) has been incubated in the 'tensorflow' branch of apple/swift since December 2017 and released as part of the Swift for TensorFlow toolchains. The Differentiable Programming Mega-Proposal, which serves as a manifesto, received generally positive feedback from the community, but there is a long way between receiving conceptual approval and obtaining Swift Evolution approval of such a large feature. We would like to merge the pieces into the 'master' branch behind a feature gate, continuing development and baking the feature on master, just as Apple develops its major features.
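The feature described above can be sketched with a minimal example. This uses the `_Differentiation` module and the `@differentiable(reverse)` spelling from recent toolchains, which differs slightly from the 'tensorflow'-branch syntax:

```swift
import _Differentiation

// Mark a function as differentiable so the compiler synthesizes
// its derivative.
@differentiable(reverse)
func square(_ x: Float) -> Float {
    x * x
}

// The gradient of `square` at x = 3: d(x*x)/dx = 2x = 6.
let dydx = gradient(at: 3, of: square)
print(dydx)  // 6.0
```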

@rxwei
rxwei / nested_loop.swift
Created September 29, 2019 01:39
Instruction count comparison
@differentiable
func nested_loop(_ x: Float, count: Int) -> Float {
  var outer = x
  outerLoop: for _ in 1..<count {
    outer = outer * x
    var inner = outer
    var i = 1
    while i < count {
      inner = inner + x
      i += 1
    }
    outer = inner
  }
  return outer
}
@rxwei
rxwei / dot-pullback-ad-all-indirect.sil
Created September 24, 2019 20:29
Zero initialization considered harmful!
// AD__dot__pullback_src_0_wrt_0_1
sil hidden [ossa] @AD__dot__pullback_src_0_wrt_0_1 : $@convention(thin) (@in_guaranteed Float, @owned _AD__dot_bb0__PB__src_0_wrt_0_1) -> (@out Vector, @out Vector) {
// %0 // user: %202
// %1 // user: %203
// %2 // user: %65
// %3 // user: %66
bb0(%0 : $*Vector, %1 : $*Vector, %2 : $*Float, %3 : @owned $_AD__dot_bb0__PB__src_0_wrt_0_1):
%4 = alloc_stack $Tracked<Float> // users: %233, %232, %196, %188, %7
%5 = witness_method $Tracked<Float>, #AdditiveArithmetic.zero!getter.1 : <Self where Self : AdditiveArithmetic> (Self.Type) -> () -> Self : $@convention(witness_method: AdditiveArithmetic) <τ_0_0 where τ_0_0 : AdditiveArithmetic> (@thick τ_0_0.Type) -> @out τ_0_0 // user: %7
%6 = metatype $@thick Tracked<Float>.Type // user: %7
@rxwei
rxwei / gist:2faf1e8f3a5b2d93acf5891d2d1c4654
Created May 29, 2019 09:42
New autodiff_function instruction
%0 = differentiable_function (%f, %f_jvp) : $sil_differentiable(1) {(T) -> T, differential: (T) -> T, pullback: (T) -> T}
%0 = differentiable_function (%f, %f_transpose) : $sil_differentiable(linear) {(T) -> T, transpose: (T) -> T}
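The two forms above bundle an original function with its derivative (or transpose) functions. Informally, the bundle created by `differentiable_function` can be modeled at the Swift level as a struct of closures (the names here are illustrative, not the compiler's):

```swift
// Illustrative model of a differentiable function bundle:
// the original function plus its JVP (forward-mode) and VJP
// (reverse-mode) derivative functions.
struct BundledFunction<T> {
    let original: (T) -> T
    let jvp: (T) -> (value: T, differential: (T) -> T)
    let vjp: (T) -> (value: T, pullback: (T) -> T)
}

let square = BundledFunction<Float>(
    original: { $0 * $0 },
    jvp: { x in (x * x, { dx in 2 * x * dx }) },
    vjp: { x in (x * x, { v in 2 * x * v }) }
)

let (value, pullback) = square.vjp(3)
print(value, pullback(1))  // 9.0 6.0
```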
@rxwei
rxwei / gist:e1488cac5379ba2bc3aff7490e18158f
Created May 20, 2019 00:57
Broadcast operator gradient performance test
import TensorFlow
import Dispatch
func foo(x: Tensor<Float>, y: Tensor<Float>) -> Tensor<Float> {
  return (x + x + x * y * y * y).sum()
}
func time(_ body: () -> Void) {
  let divisor: Float = 1_000_000_000
  let start = Float(DispatchTime.now().uptimeNanoseconds) / divisor
  body()
  let end = Float(DispatchTime.now().uptimeNanoseconds) / divisor
  print("\(end - start) seconds")
}
@rxwei
rxwei / gist:ccd93fb8a801f2aff59004722ed7c235
Created May 1, 2019 21:59
stdlib/TensorFlow Refactoring
- (Anthony) Switch swift-bindings to eager and `TFE_Execute`.
- (Anthony) Remove all `#tfop`s from stdlib/TensorFlow.
- (Marc) Remove local constexpr.
- (Parker) Remove IRGen logic from `graph_op`.
- (Parker) Remove GPE files.
- (Parker) Remove `#tfop`.
- (Parker) Remove TensorFlow Bazel build rules from build-script.
- (Parker) Set up GPU CI in tensorflow/swift-apis.
@rxwei
rxwei / blank_swift.ipynb
Created April 30, 2019 01:43
MiniGo package linker error