How to Create Spatial Effects in visionOS (visionOSで空間演出を実現する方法)

August 24, 2024

Slide overview

iOSDC 2024

Text of each slide
1.

iOSDC 2024 — How to Create Spatial Effects in visionOS / Satoshi Hattori @shmdevelop

2.

https://www.apple.com/jp/newsroom/2024/06/apple-vision-pro-arrives-in-new-countries-and-regions-beginning-june-28/

4.

visionOS apps with well-regarded spatial effects

5.

3D CG characters / portal display + 3D objects in the foreground / hand-motion interaction — "Kung Fu Panda: School of Chi" "What If…? An Immersive Story" — 3D CG characters / portal and VR display / the real surroundings remain visible / hand tracking, etc.

6.

High-quality 2D video / spatial effects / interactive elements (* more experience elements added in updates) — "GUCCI" "Disney+" — 2D video viewing / 3D video also available / a high-quality VR viewing environment

7.

In this session

8.

How to build GUCCI-like visuals and functionality

9.

Satoshi Hattori — xR Engineer, Cyber AI Productions / CyberAgent: Next AR Experts / Host of "visionOS Engineer Meetup" / GitHub: satoshi0212 / X: @shmdevelop

10.

visionOS 30 Days Challenge

11.

Image Board — 5x5, 3 lines; static pictures, tap to switch to dynamic motion

12.

Street View — images from the Google Street View API, combined into a panoramic image covering 360°

13.

Metal Shader — quote from ShaderToy; uses CAMetalLayer and CADisplayLink (visionOS doesn't support MTKView; visionOS 2 adds LowLevelTexture)

14.

https://www.amazon.co.jp/dp/4297143119/

15.

How to build GUCCI-like visuals and functionality

16.

image from "GUCCI" application

17.

image from "GUCCI" application

18.

Spatial effects / 2D video / controller

19.

image from "GUCCI" application

20.

image from "GUCCI" application

21.

image from "GUCCI" application

22.

image from "GUCCI" application

23.

image from "GUCCI" application

24.

image from "GUCCI" application

25.

image from "GUCCI" application

26.

image from "GUCCI" application

27.

What we'll build this time

28.

Line particles / rain particles + black background / fireworks / Environment (in-app)

29.

Create and host HLS with metadata / Create the spatial effects / Play the video / Show the spatial effects / Advanced: control the effects from external input

31.

The approach in this talk: embed effect-trigger tags at specified times in the HLS stream

32.

Other approaches exist as well, such as managing effects by playback time on the app side, or driving them from a separate configuration file.
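
As a point of comparison, the app-side time-based alternative mentioned above could look roughly like the sketch below. It is not what the talk implements; the `player` instance and the cue timings are placeholders.

import AVFoundation

// Sketch of the alternative approach: trigger effects from playback time on the app side,
// instead of embedding ID3 tags in the HLS stream. Timings and actions are placeholders.
let cues: [(time: Double, action: String)] = [
    (2.0, "c_on_line_particle"),
    (10.0, "c_off_line_particle"),
]
var fired = Set<Int>()

// `player` is an AVPlayer assumed to exist elsewhere in the app.
let timeObserver = player.addPeriodicTimeObserver(
    forInterval: CMTime(seconds: 0.1, preferredTimescale: 600),
    queue: .main
) { time in
    let seconds = time.seconds
    for (index, cue) in cues.enumerated() where !fired.contains(index) && seconds >= cue.time {
        fired.insert(index)
        print("Trigger effect:", cue.action)   // e.g. map to a VideoAction and forward it to the view model
    }
}
// Keep `timeObserver` around and pass it to player.removeTimeObserver(_:) when done.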

33.

Create the source video

36.

Create the ID3 tags and the macro file

37.

https://developer.apple.com/streaming/

38.

https://developer.apple.com/documentation/http-live-streaming/using-apple-s-http-live-streaming-hls-tools

39.

ID3 Tag Generator

40.

$ id3taggenerator -o reset.id3 -t "c_reset"

-o | -output-file <file>
    Specifies the path where the generated ID3 tag is written.
-t | -text <string>
    Inserts a text frame with the given string.

41.

$ id3taggenerator -o reset.id3 -t "c_reset"
$ id3taggenerator -o line_on.id3 -t "c_on_line_particle"
$ id3taggenerator -o line_off.id3 -t "c_off_line_particle"
$ id3taggenerator -o rain_on.id3 -t "c_on_rain_particle"
$ id3taggenerator -o rain_off.id3 -t "c_off_rain_particle"
$ id3taggenerator -o fireworks_on.id3 -t "c_on_fireworks_particle"
$ id3taggenerator -o fireworks_off.id3 -t "c_off_fireworks_particle"
$ id3taggenerator -o env_01_on.id3 -t "c_on_env_01"
$ id3taggenerator -o env_01_off.id3 -t "c_off_env_01"

43.

Create Macro.txt

44.

Macro.txt

0 id3 ./reset.id3
2 id3 ./line_on.id3
10 id3 ./line_off.id3
11.5 id3 ./env_01_on.id3
20.5 id3 ./env_01_off.id3
21 id3 ./rain_on.id3
30 id3 ./rain_off.id3
32 id3 ./fireworks_on.id3
40 id3 ./fireworks_off.id3
44 id3 ./reset.id3

45.

Generate the HLS resources with Media File Segmenter

46.

Media File Segmenter

47.

$ mediafilesegmenter -f ./output/ -i index.m3u8 -B media- -t 1 \
    -M ./macro.txt ./SpatialEffects001.mov

-f | -file-base path
    Directory to store the media and index files.
-i | -index-file fileName
    This option defines the index file name. The default is prog_index.m3u8. It is recommended
    that the index file have an extension of .m3u8 or .m3u.
-B | -base-media-file-name name
    This option defines the base name of the media files. The default is fileSequence. The current
    sequence number of the file is appended, and an extension added. For example, specifying name
    as AppleMediaFile will generate file names that look like AppleMediaFile12.ts.
-t | -target-duration duration
    Specifies a target duration for the media files. The default duration is 10 seconds. The
    duration is calculated by looking at the PTS/DTS in the source file.
-M | -meta-macro-file file
    Specifies the macro file to be used to insert timed metadata into the stream.

49.

Host on GitHub Pages (for development)

53.

VLC (video player)

55.

👍

56.

Create and host HLS with metadata / Create the spatial effects / Play the video / Show the spatial effects / Advanced: control the effects from external input

58.

Reality Composer Pro

64.

Line particles

66.

Line particles

67.

Line particles

68.

Rain and a black background

72.

Fireworks

76.

Environment

79.

https://developer.apple.com/documentation/realitykit/construct-an-immersive-environment-for-visionos

80.

Create and host HLS with metadata / Create the spatial effects / Play the video / Show the spatial effects / Advanced: control the effects from external input

84.

SpatialEffectsVideoPlayerApp.swift

@main
struct SpatialEffectsVideoPlayerApp: App {
    @State private var appModel = AppModel()
    @State private var playerViewModel = AVPlayerViewModel()
    @State private var surroundingsEffect: SurroundingsEffect? = .semiDark

    var body: some Scene {
        WindowGroup {
            if playerViewModel.isPlaying {
                AVPlayerView(viewModel: playerViewModel)
            } else {
                ContentView()
                    .environment(appModel)
            }
        }
        .windowResizability(.contentSize)
        .windowStyle(.plain)

        ImmersiveSpace(id: appModel.immersiveSpaceID) {
            ImmersiveView()
                .environment(appModel)
                .environment(playerViewModel)
                .onAppear { appModel.immersiveSpaceState = .open }
                .onDisappear { appModel.immersiveSpaceState = .closed }
                .preferredSurroundingsEffect(surroundingsEffect)
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}
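
AppModel itself is not shown in the slides; a minimal sketch of what the app struct appears to expect from it, with names inferred from usage (treat it as an assumption):

import Observation

// Hypothetical sketch of AppModel, inferred from how SpatialEffectsVideoPlayerApp uses it.
@MainActor
@Observable
final class AppModel {
    enum ImmersiveSpaceState {
        case closed
        case inTransition
        case open
    }

    let immersiveSpaceID = "ImmersiveSpace"
    var immersiveSpaceState: ImmersiveSpaceState = .closed
}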

86.
[beta]
AVPlayerView.swift
import SwiftUI
struct AVPlayerView: UIViewControllerRepresentable {
    let viewModel: AVPlayerViewModel

    func makeUIViewController(context: Context) -> some UIViewController {
        return viewModel.makePlayerViewController()
    }

    func updateUIViewController(_ uiViewController: UIViewControllerType, context: Context) {
        // Update the AVPlayerViewController as needed
    }
}

87.
[beta]
AVPlayerViewModel.swift
import AVKit
import Observation

@Observable
final class AVPlayerViewModel: NSObject {
    private(set) var isPlaying: Bool = false
    private var avPlayerViewController: AVPlayerViewController?
    private var avPlayer = AVPlayer()

    private let videoURL: URL? = {
        URL(string: "https://satoshi0212.github.io/hls/resources/index.m3u8")
    }()

    func makePlayerViewController() -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = avPlayer
        controller.delegate = self
        self.avPlayerViewController = controller
        self.avPlayerViewController?.delegate = self
        controller.modalPresentationStyle = .fullScreen
        return controller
    }

    func play() {
        guard !isPlaying, let videoURL else { return }
        isPlaying = true
        let item = AVPlayerItem(url: videoURL)
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        item.add(metadataOutput)
        avPlayer.replaceCurrentItem(with: item)
        avPlayer.play()
    }

    func reset() {
        guard isPlaying else { return }
        isPlaying = false
        avPlayer.replaceCurrentItem(with: nil)
    }
}
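
Note that `controller.delegate = self` requires AVPlayerViewControllerDelegate conformance, which this slide does not show; presumably it lives in an extension similar to the metadata delegate shown later. A minimal sketch under that assumption:

// Assumption: this conformance exists somewhere in the project; only the metadata delegate
// extension appears on the slides.
extension AVPlayerViewModel: AVPlayerViewControllerDelegate {}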

90.
[beta]
ImmersiveView.swift (excerpt)
struct ImmersiveView: View {
    @Environment(AVPlayerViewModel.self) private var playerViewModel
    @State var immersiveViewModel = ImmersiveViewModel()

    var body: some View {
        ZStack {
            RealityView { content in
                let entity = Entity()
                content.add(entity)
                immersiveViewModel.setup(entity: entity)
            }
            .gesture(SpatialTapGesture().targetedToAnyEntity()
                .onEnded { value in
                    if value.entity.name == "StartButton" {
                        playerViewModel.play()
                    }
                }
            )
            .onChange(of: playerViewModel.isPlaying, initial: false) { _, newValue in
                immersiveViewModel.rootEntity?.getFirstChildByName(name: "StartButton")?.isEnabled = !newValue
            }
            .onDisappear {
                playerViewModel.reset()
            }
            .transition(.opacity)
            ...
        }
    }
}
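
For the "StartButton" entity to receive the SpatialTapGesture it needs input-target and collision components. The slides do not show where that entity is built; a minimal hypothetical sketch, assuming it happens inside ImmersiveViewModel.setup where `rootEntity` is available:

// Hypothetical sketch, not shown on the slides: building a tappable start button entity.
let startButton = ModelEntity(
    mesh: .generatePlane(width: 0.3, height: 0.1, cornerRadius: 0.02),
    materials: [SimpleMaterial(color: .white, isMetallic: false)]
)
startButton.name = "StartButton"
startButton.components.set(InputTargetComponent())      // receive spatial input
startButton.generateCollisionShapes(recursive: true)    // hit-testing for the tap gesture
startButton.position = [0, 1.0, -1.0]
rootEntity.addChild(startButton)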

93.

Confirm that metadata is received

94.

AVPlayerViewModel.swift

extension AVPlayerViewModel: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        if let item = groups.first?.items.first,
           let metadataValue = item.value(forKey: "value") as? String {
            print("Metadata value: \(metadataValue)")
            // videoAction = VideoAction(rawValue: metadataValue) ?? .none
        }
    }
}

96.

🎉

97.

Create and host HLS with metadata / Create the spatial effects / Play the video / Show the spatial effects / Advanced: control the effects from external input

101.

The effect views share a common structure

102.

LineParticleView.swift

import SwiftUI
import RealityKit

struct LineParticleView: View {
    static let viewName = "LineParticleView"
    @State var viewModel = LineParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

103.

RainParticleView.swift

import SwiftUI
import RealityKit

struct RainParticleView: View {
    static let viewName = "RainParticleView"
    @State var viewModel = RainParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

104.

FireworksParticleView.swift

import SwiftUI
import RealityKit

struct FireworksParticleView: View {
    static let viewName = "FireworksParticleView"
    @State var viewModel = FireworksParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

105.

Env01View.swift

import SwiftUI
import RealityKit

struct Env01View: View {
    static let viewName = "Env01View"
    @State var viewModel = Env01ViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

106.

The view models also share essentially the same structure

107.
[beta]
LineParticleViewModel.swift
import RealityKit
import Observation
import RealityKitContent

@MainActor
@Observable
final class LineParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "LineParticle", in: realityKitContentBundle),
                  let particleEntity = scene.findEntity(named: "ParticleEmitter")
            else { return }
            particleEntity.name = "lineParticle"
            particleEntity.position = [0.0, 1.2, -0.8]
            rootEntity?.addChild(particleEntity)
        }
    }
    ...

111.

protocol LiveSequenceOperation {
    func reset() async
    func play() async
    func fadeIn() async
    func fadeOut() async
}

113.

LineParticleViewModel.swift

...
    func reset() {
        rootEntity?.opacity = 0.0
    }

    func play() {
        rootEntity?.getFirstChildByName(name: "lineParticle")?.isEnabled = true
    }

    func fadeIn() {
        Task { await rootEntity?.setOpacity(1.0, animated: true, duration: 0.4) }
    }

    func fadeOut() {
        Task { await rootEntity?.setOpacity(0.0, animated: true, duration: 0.4) }
    }
}

114.

A note here on fade in and fade out

116.
[beta]
Entity+.swift
import RealityKit

extension Entity {
    @MainActor
    func setOpacity(_ opacity: Float, animated: Bool, duration: TimeInterval = 0.2,
                    delay: TimeInterval = 0, completion: (() -> Void) = {}) async {
        guard animated, let scene else {
            self.opacity = opacity
            return
        }
        if !components.has(OpacityComponent.self) {
            components[OpacityComponent.self] = OpacityComponent(opacity: 1.0)
        }
        let animation = FromToByAnimation(name: "Entity/setOpacity", to: opacity, duration: duration,
                                          timing: .linear, isAdditive: false, bindTarget: .opacity, delay: delay)
        do {
            let animationResource: AnimationResource = try .generate(with: animation)
            let animationPlaybackController = playAnimation(animationResource)
            // Wait for this animation's playback to terminate before calling completion.
            let filtered = scene.publisher(for: AnimationEvents.PlaybackTerminated.self)
                .filter { $0.playbackController == animationPlaybackController }
            _ = await filtered.values.first { _ in true }
            completion()
        } catch {
            print("Could not generate animation: \(error.localizedDescription)")
        }
    }
}
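
Two helpers used throughout the slides, the `opacity` accessor and `getFirstChildByName(name:)`, are Entity extensions that are never shown. A minimal sketch of how they might look; the exact names are taken from the slides, the bodies are assumptions:

import RealityKit

extension Entity {
    // Hypothetical sketch: convenience accessor backed by OpacityComponent,
    // matching how the slides use `rootEntity?.opacity = 0.0`.
    var opacity: Float {
        get { components[OpacityComponent.self]?.opacity ?? 1.0 }
        set { components[OpacityComponent.self] = OpacityComponent(opacity: newValue) }
    }

    // Hypothetical sketch: recursive lookup by entity name, as used for "StartButton"
    // and the particle emitters. findEntity(named:) could serve a similar purpose.
    func getFirstChildByName(name: String) -> Entity? {
        for child in children {
            if child.name == name { return child }
            if let found = child.getFirstChildByName(name: name) { return found }
        }
        return nil
    }
}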

118.

The other view models

119.
[beta]
RainParticleViewModel.swift
@MainActor
@Observable
final class RainParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        let skyBoxEntity = Entity()
        skyBoxEntity.components.set(ModelComponent(
            mesh: .generateSphere(radius: 1000),
            materials: [UnlitMaterial(color: .black)]
        ))
        skyBoxEntity.scale *= .init(x: -1, y: 1, z: 1)
        rootEntity?.addChild(skyBoxEntity)
        Task {
            if let scene = try? await Entity(named: "RainParticle", in: realityKitContentBundle) {
                let particleEntity = scene.findEntity(named: "ParticleEmitter")!
                particleEntity.name = "rainParticle"
                particleEntity.position = [0.0, 3.0, -2.0]
                rootEntity?.addChild(particleEntity)
            }
        }
    }
    ...

120.

RainParticleViewModel.swift

...
    func reset() {
        rootEntity?.opacity = 0.0
    }

    func play() {
        rootEntity?.getFirstChildByName(name: "rainParticle")?.isEnabled = true
    }

    func fadeIn() {
        Task { await rootEntity?.setOpacity(1.0, animated: true, duration: 1.4) }
    }

    func fadeOut() {
        Task { await rootEntity?.setOpacity(0.0, animated: true, duration: 1.4) }
    }
}

121.

FireworksParticleViewModel.swift

@MainActor
@Observable
final class FireworksParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "Fireworks", in: realityKitContentBundle) else { return }
            rootEntity?.addChild(scene)
        }
    }
    ...
}

122.

Env01ViewModel.swift

@MainActor
@Observable
final class Env01ViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "Env_01", in: realityKitContentBundle) else { return }
            rootEntity?.addChild(scene)
        }
    }
    ...
}

123.

Create and place the views

125.

ImmersiveViewModel.swift (excerpt)

@MainActor
@Observable
class ImmersiveViewModel {
    private(set) var rootEntity: Entity?
    let lineParticleView: LineParticleView = .init()
    let rainParticleView: RainParticleView = .init()
    let fireworksParticleView: FireworksParticleView = .init()
    let env01View: Env01View = .init()

    @ObservationIgnored private lazy var effectViewModels: [String : LiveSequenceOperation] = {
        return [
            LineParticleView.viewName : self.lineParticleView.viewModel,
            RainParticleView.viewName : self.rainParticleView.viewModel,
            FireworksParticleView.viewName : self.fireworksParticleView.viewModel,
            Env01View.viewName : self.env01View.viewModel,
        ]
    }()
    ...
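
The `play(viewName:)`, `fadeIn(viewName:)`, `fadeOut(viewName:)`, and `resetAction()` helpers used later by processVideoAction are not shown on the slides. A minimal sketch, assuming they simply dispatch through the effectViewModels dictionary above (placed in the same file so the private property is accessible):

// Hypothetical sketch: dispatch helpers inferred from how processVideoAction calls them.
extension ImmersiveViewModel {
    func play(viewName: String) async {
        await effectViewModels[viewName]?.play()
    }

    func fadeIn(viewName: String) async {
        await effectViewModels[viewName]?.fadeIn()
    }

    func fadeOut(viewName: String) async {
        await effectViewModels[viewName]?.fadeOut()
    }

    func resetAction() {
        Task {
            for viewModel in effectViewModels.values {
                await viewModel.reset()
            }
        }
    }
}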

127.
[beta]
ImmersiveView.swift (excerpt)
struct ImmersiveView: View {
    @Environment(AVPlayerViewModel.self) private var playerViewModel
    @State var immersiveViewModel = ImmersiveViewModel()

    var body: some View {
        ZStack {
            RealityView { content in
                let entity = Entity()
                content.add(entity)
                immersiveViewModel.setup(entity: entity)
            }
            .gesture(SpatialTapGesture().targetedToAnyEntity()
                .onEnded { value in
                    if value.entity.name == "StartButton" {
                        playerViewModel.play()
                    }
                }
            )
            .onChange(of: playerViewModel.videoAction, initial: true) { oldValue, newValue in
                immersiveViewModel.processVideoAction(oldValue: oldValue, newValue: newValue)
            }
            .onChange(of: playerViewModel.isPlaying, initial: false) { _, newValue in
                immersiveViewModel.rootEntity?.getFirstChildByName(name: "StartButton")?.isEnabled = !newValue
            }
            .onDisappear {
                playerViewModel.reset()
            }
            .transition(.opacity)

            // place effect views
            immersiveViewModel.lineParticleView
            immersiveViewModel.rainParticleView
            immersiveViewModel.fireworksParticleView
            immersiveViewModel.env01View
        }
    }
}

128.

The views and view models are now created and placed

129.

Show the views according to the metadata tags

130.

enum VideoAction: String {
    case none
    case c_reset
    case c_on_line_particle
    case c_off_line_particle
    case c_on_rain_particle
    case c_off_rain_particle
    case c_on_fireworks_particle
    case c_off_fireworks_particle
    case c_on_env_01
    case c_off_env_01
}

131.

AVPlayerViewModel.swift

extension AVPlayerViewModel: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        if let item = groups.first?.items.first,
           let metadataValue = item.value(forKey: "value") as? String {
            print("Metadata value: \(metadataValue)")
            videoAction = VideoAction(rawValue: metadataValue) ?? .none
        }
    }
}
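
The `videoAction` assigned here is not declared in the AVPlayerViewModel class shown earlier; presumably the class also holds something like the following, inferred from how ImmersiveView observes it (an assumption, not shown on the slides):

// Assumption: an observable property on AVPlayerViewModel that ImmersiveView watches via onChange.
var videoAction: VideoAction = .none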

134.

ImmersiveViewModel.swift

func processVideoAction(oldValue: VideoAction = .none, newValue: VideoAction = .none) {
    // avoid continuous firing of actions other than the reset action
    if newValue != .c_reset && oldValue == newValue { return }
    switch newValue {
    case .none:
        break
    case .c_reset:
        resetAction()
    case .c_on_line_particle:
        Task {
            await play(viewName: LineParticleView.viewName)
            await fadeIn(viewName: LineParticleView.viewName)
        }
    case .c_off_line_particle:
        Task { await fadeOut(viewName: LineParticleView.viewName) }
    case .c_on_rain_particle:
        ...

136.

ImmersiveViewModel.swift

func processVideoAction(oldValue: VideoAction = .none, newValue: VideoAction = .none) {
    // avoid continuous firing of actions other than the reset action
    if newValue != .c_reset && oldValue == newValue { return }
    switch newValue {
    case .none:
        break
    case .c_reset:
        resetAction()
    case .c_on_line_particle:
        Task {
            await play(viewName: LineParticleView.viewName)
            await fadeIn(viewName: LineParticleView.viewName)
        }
    case .c_off_line_particle:
        Task { await fadeOut(viewName: LineParticleView.viewName) }
    case .c_on_rain_particle:
        Task {
            await play(viewName: RainParticleView.viewName)
            await fadeIn(viewName: RainParticleView.viewName)
        }
    case .c_off_rain_particle:
        Task { await fadeOut(viewName: RainParticleView.viewName) }
    case .c_on_fireworks_particle:
        Task {
            await play(viewName: FireworksParticleView.viewName)
            await fadeIn(viewName: FireworksParticleView.viewName)
        }
    case .c_off_fireworks_particle:
        Task { await fadeOut(viewName: FireworksParticleView.viewName) }
    case .c_on_env_01:
        Task { await fadeIn(viewName: Env01View.viewName) }
    case .c_off_env_01:
        Task { await fadeOut(viewName: Env01View.viewName) }
    }
}

137.

Now everything is wired together

139.

🎉

140.

Create and host HLS with metadata / Create the spatial effects / Play the video / Show the spatial effects / Advanced: control the effects from external input

142.

Control the effects externally with OSC

143.

https://www.elgato.com/jp/ja/p/stream-deck-mk2-black

145.

ImmersiveViewModel.swift

import OSCKit

class ImmersiveViewModel {
    private let oscClient = OSCClient()
    private let oscServer = OSCServer(port: 55535)
    private let addressSpace = OSCAddressSpace()

    func setup(entity: Entity) {
        rootEntity = entity
        ...
        setupOSC()
    }

    // MARK: - OSC
    private func setupOSC() {
        ...
    }
}

146.

ImmersiveViewModel.swift

private func setupOSC() {
    ...
    addressSpace.register(localAddress: "/line_on") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_on_line_particle)
    }
    addressSpace.register(localAddress: "/line_off") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_off_line_particle)
    }
    oscServer.setHandler { [weak self] message, timeTag in
        do {
            try self?.handle(message: message, timeTag: timeTag)
        } catch {
            print(error)
        }
    }
    do {
        try oscServer.start()
    } catch {
        print(error)
    }
}

private func handle(message: OSCMessage, timeTag: OSCTimeTag) throws {
    let methodIDs = addressSpace.dispatch(message)
    if methodIDs.isEmpty {
        print("No method registered for:", message)
    }
}

147.

ImmersiveViewModel.swift

private func setupOSC() {
    addressSpace.register(localAddress: "/reset") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_reset)
    }
    addressSpace.register(localAddress: "/line_on") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_on_line_particle)
    }
    addressSpace.register(localAddress: "/line_off") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_off_line_particle)
    }
    addressSpace.register(localAddress: "/rain_on") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_on_rain_particle)
    }
    addressSpace.register(localAddress: "/rain_off") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_off_rain_particle)
    }
    addressSpace.register(localAddress: "/fireworks_on") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_on_fireworks_particle)
    }
    addressSpace.register(localAddress: "/fireworks_off") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_off_fireworks_particle)
    }
    addressSpace.register(localAddress: "/env_01_on") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_on_env_01)
    }
    addressSpace.register(localAddress: "/env_01_off") { [weak self] _ in
        guard let self else { return }
        self.processVideoAction(newValue: .c_off_env_01)
    }
    oscServer.setHandler { [weak self] message, timeTag in
        do {
            try self?.handle(message: message, timeTag: timeTag)
        } catch {
            print(error)
        }
    }
    do {
        try oscServer.start()
    } catch {
        print(error)
    }
}
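
The OSCClient declared on the earlier slide is not used in this excerpt; the external controller (the Stream Deck / TouchDesigner setup shown next) would send messages to port 55535. A rough sketch of the sending side with OSCKit, under the assumption that the library's send API is used directly (the host address is a placeholder):

import OSCKit

// Hypothetical sending side: fire one of the registered addresses at the headset's IP.
let client = OSCClient()
let message = OSCMessage("/fireworks_on")
do {
    try client.send(message, to: "192.168.0.10", port: 55535)   // placeholder host address
} catch {
    print(error)
}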

148.

Stream Deck configuration app

149.

Stream Deck configuration app

150.

Stream Deck configuration app

151.

Demo Video

152.

Demo

153.

Demo Video

154.

Wrap up — Coming soon (requires Xcode 16 or later; also runs in the visionOS simulator): Spatial Effects VideoPlayer / command files for HLS ID3 tag embedding / HLS Downloader / HLS tag Viewer / TouchDesigner: OSC_Dispatcher / Stream Deck: profile — https://github.com/satoshi0212/visionOS_30Days https://github.com/satoshi0212/visionOS_2_30Days