Upgrade Swift language version to 4.2 #306

Open · wants to merge 2 commits into master
2 changes: 1 addition & 1 deletion Package.swift
@@ -56,5 +56,5 @@ let package = Package(
     products: platformProducts,
     dependencies: platformDependencies,
     targets: platformTargets,
-    swiftLanguageVersions: [.v4]
+    swiftLanguageVersions: [.v4_2]
 )
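Note that the `.v4_2` case of `SwiftVersion` only exists in the PackageDescription API from tools version 4.2 onward, so the manifest's `// swift-tools-version` declaration must also be at 4.2 or later for this line to compile (under the 4.0 tools format, `swiftLanguageVersions` took a different form). A minimal sketch of how the updated manifest fits together; the `platform*` values below are placeholders standing in for the definitions earlier in this repository's Package.swift:

```swift
// swift-tools-version:4.2
import PackageDescription

// Placeholder values; in this repository these are built up
// per-platform earlier in Package.swift.
let platformProducts: [Product] = [.library(name: "GPUImage", targets: ["GPUImage"])]
let platformDependencies: [Package.Dependency] = []
let platformTargets: [Target] = [.target(name: "GPUImage", path: "framework/Source")]

let package = Package(
    name: "GPUImage",
    products: platformProducts,
    dependencies: platformDependencies,
    targets: platformTargets,
    swiftLanguageVersions: [.v4_2] // SwiftVersion case introduced with tools 4.2
)
```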
4 changes: 2 additions & 2 deletions README.md
@@ -24,8 +24,8 @@ Currently, GPUImage uses Lode Vandevenne's <a href="https://lodev.org/lodepng/">
 
 ## Technical requirements ##
 
-- Swift 3
-- Xcode 8.0 on Mac or iOS
+- Swift 4.2 or higher
+- Xcode 10.1 or higher on Mac or iOS
 - iOS: 8.0 or higher (Swift is supported on 7.0, but not Mac-style frameworks)
 - OSX: 10.9 or higher
 - Linux: Wherever Swift code can be compiled. Currently, that's Ubuntu 14.04 or higher, along with the many other places it has been ported to. I've gotten this running on the latest Raspbian, for example. For camera input, Video4Linux needs to be installed.
4 changes: 2 additions & 2 deletions framework/Source/Apple/MovieInput.swift
@@ -10,7 +10,7 @@ public class MovieInput: ImageSource {
     let playAtActualSpeed:Bool
     let loop:Bool
     var videoEncodingIsFinished = false
-    var previousFrameTime = kCMTimeZero
+    var previousFrameTime = CMTime.zero
     var previousActualFrameTime = CFAbsoluteTimeGetCurrent()
 
     var numberOfFramesCaptured = 0
@@ -55,7 +55,7 @@ public class MovieInput: ImageSource {
         var readerVideoTrackOutput:AVAssetReaderOutput? = nil;
 
         for output in self.assetReader.outputs {
-            if(output.mediaType == AVMediaType.video.rawValue) {
+            if(output.mediaType == .video) {
                 readerVideoTrackOutput = output;
             }
         }
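Two Swift 4.2 renames are at work in this file: the global `kCMTime*` constants moved onto `CMTime` as static properties, and `AVAssetReaderOutput.mediaType` is now a typed `AVMediaType`, so the shorthand `.video` compares directly without going through `rawValue`. A standalone sketch of both spellings (illustrative only, not GPUImage code):

```swift
import AVFoundation
import CoreMedia

// kCMTimeZero became CMTime.zero in Swift 4.2.
var previousFrameTime = CMTime.zero

// mediaType is a typed AVMediaType, so `.video` compares directly.
func firstVideoOutput(of reader: AVAssetReader) -> AVAssetReaderOutput? {
    for output in reader.outputs where output.mediaType == .video {
        return output
    }
    return nil
}
```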
10 changes: 5 additions & 5 deletions framework/Source/Apple/MovieOutput.swift
@@ -20,8 +20,8 @@ public class MovieOutput: ImageConsumer, AudioEncodingTarget {
     private var videoEncodingIsFinished = false
     private var audioEncodingIsFinished = false
     private var startTime:CMTime?
-    private var previousFrameTime = kCMTimeNegativeInfinity
-    private var previousAudioTime = kCMTimeNegativeInfinity
+    private var previousFrameTime = CMTime.negativeInfinity
+    private var previousAudioTime = CMTime.negativeInfinity
     private var encodingLiveVideo:Bool
     var pixelBuffer:CVPixelBuffer? = nil
     var renderFramebuffer:Framebuffer!
@@ -45,7 +45,7 @@ public class MovieOutput: ImageConsumer, AudioEncodingTarget {
         self.size = size
         assetWriter = try AVAssetWriter(url:URL, fileType:fileType)
         // Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
-        assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000)
+        assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, preferredTimescale: 1000)
 
         var localSettings:[String:AnyObject]
         if let settings = settings {
@@ -56,7 +56,7 @@
 
         localSettings[AVVideoWidthKey] = localSettings[AVVideoWidthKey] ?? NSNumber(value:size.width)
         localSettings[AVVideoHeightKey] = localSettings[AVVideoHeightKey] ?? NSNumber(value:size.height)
-        localSettings[AVVideoCodecKey] = localSettings[AVVideoCodecKey] ?? AVVideoCodecH264 as NSString
+        localSettings[AVVideoCodecKey] = localSettings[AVVideoCodecKey] ?? AVVideoCodecType.h264 as NSString
 
         assetWriterVideoInput = AVAssetWriterInput(mediaType:AVMediaType.video, outputSettings:localSettings)
         assetWriterVideoInput.expectsMediaDataInRealTime = liveVideo
@@ -266,7 +266,7 @@ public extension Timestamp {
 
     var asCMTime:CMTime {
         get {
-            return CMTimeMakeWithEpoch(value, timescale, epoch)
+            return CMTimeMakeWithEpoch(value: value, timescale: timescale, epoch: epoch)
        }
    }
}
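The changes in this file are all mechanical Swift 4.2 renames: the CoreMedia constructors `CMTimeMakeWithSeconds` and `CMTimeMakeWithEpoch` gained argument labels, `kCMTimeNegativeInfinity` became `CMTime.negativeInfinity`, and the string constant `AVVideoCodecH264` was superseded by `AVVideoCodecType.h264`. A small illustrative sketch (values are arbitrary, not taken from MovieOutput):

```swift
import AVFoundation
import CoreMedia

// CoreMedia constructors now take argument labels.
let fragmentInterval = CMTimeMakeWithSeconds(1.0, preferredTimescale: 1000)
let epochTime = CMTimeMakeWithEpoch(value: 1, timescale: 1000, epoch: 0)

// kCMTimeNegativeInfinity is now a static property on CMTime.
var previousFrameTime = CMTime.negativeInfinity

// AVVideoCodecH264 (a String constant) became AVVideoCodecType.h264.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720,
]
```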
4 changes: 2 additions & 2 deletions framework/Source/Apple/PictureOutput.swift
@@ -97,8 +97,8 @@ public class PictureOutput: ImageConsumer {
 #if canImport(UIKit)
         let image = UIImage(cgImage:cgImageFromBytes, scale:1.0, orientation:.up)
         switch encodedImageFormat {
-        case .png: imageData = UIImagePNGRepresentation(image)! // TODO: Better error handling here
-        case .jpeg: imageData = UIImageJPEGRepresentation(image, 0.8)! // TODO: Be able to set image quality
+        case .png: imageData = image.pngData()! // TODO: Better error handling here
+        case .jpeg: imageData = image.jpegData(compressionQuality: 0.8)! // TODO: Be able to set image quality
         }
 #else
         let bitmapRepresentation = NSBitmapImageRep(cgImage:cgImageFromBytes)
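`UIImagePNGRepresentation(_:)` and `UIImageJPEGRepresentation(_:_:)` became instance methods on `UIImage` in the iOS 12 SDK that accompanies Swift 4.2, which is what the new `case` bodies call. Since the TODOs above flag the force-unwraps, a hedged sketch of a nil-returning alternative (the `encode` helper is hypothetical, not part of PictureOutput):

```swift
import UIKit

// Hypothetical helper: returns nil instead of crashing when encoding fails.
func encode(_ image: UIImage, asPNG: Bool, jpegQuality: CGFloat = 0.8) -> Data? {
    return asPNG ? image.pngData() : image.jpegData(compressionQuality: jpegQuality)
}
```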