Chichibu is also famous as the setting of the anime "Anohana: The Flower We Saw That Day." It aired in 2011, but many anime fans still visit Chichibu to see the real-life locations of its scenes.
When I run TimeLapse from the previous article, the following warnings appear in the terminal.
objc[3865]: Class QMacAutoReleasePoolTracker is implemented in both /usr/local/opt/qt/lib/QtCore.framework/Versions/5/QtCore (0x10a1e0a48) and /usr/local/lib/python3.8/site-packages/cv2/.dylibs/QtCore (0x10f5fe700). One of the two will be used. Which one is undefined.
objc[3865]: Class QT_ROOT_LEVEL_POOL__THESE_OBJECTS_WILL_BE_RELEASED_WHEN_QAPP_GOES_OUT_OF_SCOPE is implemented in both /usr/local/opt/qt/lib/QtCore.framework/Versions/5/QtCore (0x10a1e0ac0) and /usr/local/lib/python3.8/site-packages/cv2/.dylibs/QtCore (0x10f5fe778). One of the two will be used. Which one is undefined.
objc[3865]: Class KeyValueObserver is implemented in both /usr/local/opt/qt/lib/QtCore.framework/Versions/5/QtCore (0x10a1e0ae8) and /usr/local/lib/python3.8/site-packages/cv2/.dylibs/QtCore (0x10f5fe7a0). One of the two will be used. Which one is undefined.
objc[3865]: Class RunLoopModeTracker is implemented in both /usr/local/opt/qt/lib/QtCore.framework/Versions/5/QtCore (0x10a1e0b38) and /usr/local/lib/python3.8/site-packages/cv2/.dylibs/QtCore (0x10f5fe7f0). One of the two will be used. Which one is undefined.
Capturing Live Photos requires an internal reconfiguration of the capture pipeline, which takes time and interrupts any in-progress captures. Before shooting your first Live Photo, enable Live Photo capture on the AVCapturePhotoOutput object so that the pipeline is configured appropriately.
let photoOutput = AVCapturePhotoOutput()

// Attempt to add the photo output to the session.
if captureSession.canAddOutput(photoOutput) {
    captureSession.sessionPreset = .photo
    captureSession.addOutput(photoOutput)
} else {
    throw CameraError.configurationFailed
}

// Configure the photo output's behavior.
photoOutput.isHighResolutionCaptureEnabled = true
photoOutput.isLivePhotoCaptureEnabled = photoOutput.isLivePhotoCaptureSupported

// Start the capture session.
captureSession.startRunning()

let photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
photoSettings.livePhotoMovieFileURL = // output url

// Shoot the Live Photo, using a custom class to handle capture delegate callbacks.
let captureProcessor = LivePhotoCaptureProcessor()
photoOutput.capturePhoto(with: photoSettings, delegate: captureProcessor)
func saveLivePhotoToPhotosLibrary(stillImageData: Data, livePhotoMovieURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }

        PHPhotoLibrary.shared().performChanges({
            // Add the captured photo's file data as the main resource for the Photos asset.
            let creationRequest = PHAssetCreationRequest.forAsset()
            creationRequest.addResource(with: .photo, data: stillImageData, options: nil)

            // Add the movie file URL as the Live Photo's paired video resource.
            let options = PHAssetResourceCreationOptions()
            options.shouldMoveFile = true
            creationRequest.addResource(with: .pairedVideo, fileURL: livePhotoMovieURL, options: options)
        }) { success, error in
            // Handle completion.
        }
    }
}
Capturing a Live Photo adds two steps to the process shown in "Tracking Photo Capture Progress." After delivery of the still image result (step 4), the photo output reports the movie capture status (step 5) and then delivers the movie result (step 6). (Final cleanup becomes step 7.)
class LivePhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {
    // ... other PhotoCaptureDelegate methods and supporting properties ...

    // A handler to call when Live Photo capture begins and ends.
    var livePhotoStatusHandler: (Bool) -> () = { _ in }

    // A property for tracking in-progress captures and updating UI accordingly.
    var livePhotosInProgress = 0 {
        didSet {
            // Update the UI based on the value of this property.
        }
    }

    // Call the handler when PhotoCaptureDelegate methods indicate Live Photo capture is in progress.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        let capturingLivePhoto = (resolvedSettings.livePhotoMovieDimensions.width > 0 &&
                                  resolvedSettings.livePhotoMovieDimensions.height > 0)
        livePhotoStatusHandler(capturingLivePhoto)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishRecordingLivePhotoMovieForEventualFileAt outputFileURL: URL,
                     resolvedSettings: AVCaptureResolvedPhotoSettings) {
        livePhotoStatusHandler(false)
    }
}
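Step 6, the delivery of the finished movie file, is reported through another delegate callback. A minimal sketch of how it might be handled inside LivePhotoCaptureProcessor; the commented-out call to the earlier saveLivePhotoToPhotosLibrary helper, and the idea of stashing the still image data in a property, are assumptions, not part of Apple's sample:

```swift
// Sketch: step 6, delivery of the Live Photo movie result.
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingLivePhotoToMovieFileAt outputFileURL: URL,
                 duration: CMTime,
                 photoDisplayTime: CMTime,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 error: Error?) {
    if let error = error {
        print("Live Photo movie processing error: \(error)")
        return
    }
    // Pair the movie with the still image data captured earlier, e.g.:
    // saveLivePhotoToPhotosLibrary(stillImageData: stillImageData, livePhotoMovieURL: outputFileURL)
}
```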
PHPhotoLibrary.requestAuthorization { status in
    guard status == .authorized else { return }
    // Use PHPhotoLibrary.shared().performChanges(...) to add assets.
}
class PhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {
    var completionHandler: () -> () = {}

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                     error: Error?) {
        completionHandler()
    }

    // ... other delegate methods to handle capture results ...
}
// Keep a set of in-progress capture delegates.
var capturesInProgress = Set<PhotoCaptureProcessor>()

func shootPhoto() {
    // Make a new capture delegate for each capture and add it to the set.
    let captureProcessor = PhotoCaptureProcessor()
    capturesInProgress.insert(captureProcessor)

    // Schedule the capture delegate to be removed from the set after capture.
    captureProcessor.completionHandler = { [weak self] in
        self?.capturesInProgress.remove(captureProcessor)
    }

    self.photoOutput.capturePhoto(with: self.settingsForNextPhoto(), delegate: captureProcessor)
}
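The settingsForNextPhoto() helper above must return a fresh object on every call, because an AVCapturePhotoSettings instance can only be used for a single capture. A minimal sketch of such a helper, assuming the HEVC format and the photoOutput configured earlier; the temporary-directory movie URL is an illustrative choice, not prescribed by the API:

```swift
// Sketch: create a new, single-use settings object for each capture.
func settingsForNextPhoto() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    settings.isHighResolutionPhotoEnabled = true

    // Only request a Live Photo movie if the output currently supports it.
    if photoOutput.isLivePhotoCaptureEnabled {
        settings.livePhotoMovieFileURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
    }
    return settings
}
```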
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized: // The user has previously granted access to the camera.
    self.setupCaptureSession()
case .notDetermined: // The user has not yet been asked for camera access.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            self.setupCaptureSession()
        }
    }
case .denied: // The user has previously denied access.
    return
case .restricted: // The user can't grant access due to restrictions.
    return
@unknown default:
    return
}
class PreviewView: UIView {
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    /// Convenience wrapper to get the layer as its statically known type.
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }
}
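To actually display the camera feed, the preview layer still needs to be attached to the capture session. A short usage sketch; the previewView and captureSession names are assumptions carried over from the snippets above:

```swift
// Sketch: attach the capture session to the preview view.
let previewView = PreviewView()
previewView.videoPreviewLayer.session = captureSession
previewView.videoPreviewLayer.videoGravity = .resizeAspectFill
```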