
In the previous post, we learned how to connect the input and output and start a session. In this post, we'll take the preview from that session and display it in a SwiftUI View.

1. Load Preview Layer

let session = AVCaptureSession()
let previewLayer = AVCaptureVideoPreviewLayer(session: session)

// Done...!

The preview layer is easy to create: just pass an AVCaptureSession instance to the AVCaptureVideoPreviewLayer initializer.

  • For UIKit, that's all there is to it: add the layer as a sublayer of a UIView, adjust its layout, and you're done.
  • SwiftUI is a different story. We need the UIViewControllerRepresentable protocol to bridge the layer into SwiftUI's View world.
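For reference, the UIKit path really is that short. A minimal sketch, assuming a view controller that owns a session configured elsewhere (as in the previous post):

```swift
import AVFoundation
import UIKit

final class CameraViewController: UIViewController {
    // Assumed to be configured and started elsewhere.
    let session = AVCaptureSession()
    private var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        previewLayer = layer
    }

    // Keep the layer in sync with the view's size.
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        previewLayer?.frame = view.bounds
    }
}
```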

2. Transfer layer to View

struct Preview: UIViewControllerRepresentable {
    let previewLayer: AVCaptureVideoPreviewLayer
    let gravity: AVLayerVideoGravity

    init(
        session: AVCaptureSession,
        gravity: AVLayerVideoGravity
    ) {
        self.gravity = gravity
        self.previewLayer = AVCaptureVideoPreviewLayer(session: session)
    }

    func makeUIViewController(context: Context) -> UIViewController {
        PreviewViewController(previewLayer: previewLayer)
    }

    func updateUIViewController(_ uiViewController: UIViewController, context: Context) {
        previewLayer.videoGravity = gravity
    }

    // Note: dismantleUIViewController is a static protocol requirement.
    static func dismantleUIViewController(_ uiViewController: UIViewController, coordinator: ()) {
        uiViewController.view.layer.sublayers?
            .compactMap { $0 as? AVCaptureVideoPreviewLayer }
            .forEach { $0.removeFromSuperlayer() }
    }
}

final class PreviewViewController: UIViewController {
    let previewLayer: AVCaptureVideoPreviewLayer

    init(previewLayer: AVCaptureVideoPreviewLayer) {
        self.previewLayer = previewLayer
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.layer.addSublayer(previewLayer)
    }

    // Resize the layer whenever the view lays out its subviews.
    // Without this, the layer's frame stays zero and nothing appears.
    override func viewWillLayoutSubviews() {
        super.viewWillLayoutSubviews()
        previewLayer.frame = view.bounds
    }
}

Create a UIKit UIViewController using UIViewControllerRepresentable.

  • After receiving the session, wrap it in a preview layer and add that layer to the root view of the view controller.
  • Then, in the view controller's viewWillLayoutSubviews override, resize the layer to match the view's changing size. Without this step, the layer's size stays zero and the preview never appears.

3. Display in View

class VideoContentViewModel: NSObject, ObservableObject {
    let session: AVCaptureSession
    @Published var preview: Preview?

    override init() {
        self.session = AVCaptureSession()

        super.init()

        Task(priority: .background) {
            switch await AuthorizationChecker.checkCaptureAuthorizationStatus() {
            case .permitted:
                try session
                    .addMovieInput()
                    .addMovieFileOutput()
                    .startRunning()

                DispatchQueue.main.async {
                    self.preview = Preview(session: self.session, gravity: .resizeAspectFill)
                }

            case .notPermitted:
                break
            }
        }
    }

    ...
}

struct VideoContentView: View {
    @StateObject var viewModel = VideoContentViewModel()

    var body: some View {
        viewModel.preview?
            .frame(minWidth: 0, maxWidth: .infinity, minHeight: 0, maxHeight: .infinity)
            .edgesIgnoringSafeArea(.all)
    }
}
  • Now, the preview layer can be used as if it were a SwiftUI View! You can display the preview full screen by setting the frame to infinity and setting edgesIgnoringSafeArea to all.
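AuthorizationChecker used in the view model comes from the previous post. If you don't have it handy, a minimal sketch of such a helper (names assumed to match the usage above) could look like this, built on AVCaptureDevice's authorization APIs:

```swift
import AVFoundation

enum AuthorizationStatus {
    case permitted
    case notPermitted
}

enum AuthorizationChecker {
    static func checkCaptureAuthorizationStatus() async -> AuthorizationStatus {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            return .permitted
        case .notDetermined:
            // Suspends until the user answers the system permission prompt.
            let granted = await AVCaptureDevice.requestAccess(for: .video)
            return granted ? .permitted : .notPermitted
        default:
            // .denied, .restricted
            return .notPermitted
        }
    }
}
```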

Wrap up

We’re almost there! In the next post, we will learn how to start and stop recording, and if desired, how to convert and save video files to the album.

Advertisement 👋

Aespa: Easiest camera handling package ever for SwiftUI & UIKit

I've created a package that compresses the tedious video setup process into just three lines.

It’s designed to be simple enough for someone who is developing for iOS for the first time and provides essential default settings for video recording. Of course, customization is possible if you want.

It offers documentation written with Swift-DocC and a demo app with detailed implementation examples, so feel free to take a look if you're interested. A star would be much appreciated!
