Cloud gaming is a great way to enjoy graphically demanding games on Apple Vision Pro.
Since Safari on visionOS does not support installing web apps (PWAs), here is how you can access cloud gaming services on Apple Vision Pro.
While it's possible to stream most content to Apple Vision Pro directly over the internet, having the ability to use Apple Vision Pro as an HDMI display can still be useful.
Since Apple Vision Pro does not support connecting to an HDMI input directly or using an HDMI capture card, we have to be a little creative to make this work. NDI provides the ability to stream HDMI content over a local network with really low latency, and it works great with Apple Vision Pro.
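If you are curious how NDI discovery works under the hood, here is a minimal, purely illustrative sketch of finding NDI sources on the local network. It assumes the NDI SDK's C headers (Processing.NDI.Lib.h) are bridged into Swift via a bridging header or module map; the function names come from the SDK's C "find" API, and this is not part of the setup described on this page.

```swift
import Foundation

// Hypothetical sketch: list NDI senders visible on the local network.
// Assumes the NDI SDK's C API is available to Swift via a bridging header.
func listNDISources() {
    // Initialize the NDI runtime; returns false if the platform is unsupported.
    guard NDIlib_initialize() else { return }
    defer { NDIlib_destroy() }

    // Create a finder that browses the local network for NDI senders.
    guard let finder = NDIlib_find_create_v2(nil) else { return }
    defer { NDIlib_find_destroy(finder) }

    // Give discovery a few seconds to complete.
    _ = NDIlib_find_wait_for_sources(finder, 5000)

    var count: UInt32 = 0
    guard let sources = NDIlib_find_get_current_sources(finder, &count) else { return }

    for i in 0..<Int(count) {
        if let namePtr = sources[i].p_ndi_name {
            print("Found NDI source:", String(cString: namePtr))
        }
    }
}
```

In practice you don't need to write any code for this setup; existing NDI sender and viewer apps handle discovery and playback for you.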
This page shows the setup I’m using.