objective-c, ios, avfoundation related issues - Collection of common programming errors
Victor Ronin
objective-c ios
I read Apple’s recommendation on exception usage and NSError usage. Also, I read several similar Stack Overflow questions which discuss whether or not to use exceptions: Exception Handling in iOS, Using exceptions in Objective-C, Objective-C Exceptions. I am trying to figure out the pros and cons of using exceptions as an error notification/handling method in iOS. (Frankly, I am not satisfied with Apple’s sentence; it says what to do, but it doesn’t say why we should do it): You should reserve the use of exceptio
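As a minimal sketch of the two conventions being weighed here (the method and property names are illustrative, not from the question): recoverable failures go through an NSError out-parameter, while exceptions are reserved for programmer errors.

// NSError pattern: recoverable failures are reported via an out-parameter.
- (NSData *)loadConfigurationAtPath:(NSString *)path error:(NSError **)error {
    NSData *data = [NSData dataWithContentsOfFile:path options:0 error:error];
    if (!data) {
        return nil; // caller inspects *error and decides how to recover
    }
    return data;
}

// Exceptions are reserved for programming errors caught during development.
- (void)setAge:(NSInteger)age {
    if (age < 0) {
        [NSException raise:NSInvalidArgumentException
                    format:@"age must be >= 0, got %ld", (long)age];
    }
    _age = age; // assumes a backing ivar
}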
Stephen Darlington
objective-c
I’m a new iPhone/Objective-C developer and as I’m going through different tutorials and open source code, I am having a bit of a problem understanding when to use the square brackets "[ ]" and when to use the period "." for accessing properties/methods of an object. For example, this code:
- (void)setSelected:(BOOL)selected animated:(BOOL)animated {
    [super setSelected:selected animated:animated];
    UIColor *backgroundColor = nil;
    if (selected) {
        backgroundColor = [UIColor clearColor];
    } else {
        background
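For illustration, dot syntax is compiler sugar for the property accessor methods, while brackets are needed for any other message send (a short sketch in the same cell-subclass context):

// Dot syntax compiles down to the accessor methods:
self.backgroundColor = [UIColor clearColor];   // same as [self setBackgroundColor:[UIColor clearColor]];
UIColor *current = self.backgroundColor;       // same as [self backgroundColor];

// Brackets are required for messages that are not simple getters/setters:
[super setSelected:selected animated:animated];
[self.textLabel setText:@"Hello"];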
TheAmateurProgrammer
rmaddy
objective-c physics sprite-kit skphysicsbody
So I have an object that has a physicsBody and gravity affects it. It is also dynamic. Currently, when the user touches the screen, I run the code: applyForce(0, 400). The object moves up about 200 and then falls back down due to gravity. This only happens some of the time. Other times, it results in the object only moving 50-ish units in the Y direction. I can’t find a pattern… I put my project on Dropbox so it can be opened if anyone is willing to look at it. https://www.dropbox.com/sh/z0nt79pd0l5
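One common explanation is that applyForce: is integrated over the duration of a simulation step, so a single call from a touch handler is timing-dependent; an impulse (or setting the velocity directly) gives a deterministic kick. A sketch, assuming a node property named player:

// Force is accumulated over time; a one-off call in touchesBegan varies with frame timing.
// [self.player.physicsBody applyForce:CGVectorMake(0, 400)];

// An impulse delivers a fixed change in momentum regardless of frame timing.
[self.player.physicsBody applyImpulse:CGVectorMake(0, 40)];

// Or make the jump fully deterministic by setting the vertical velocity directly.
self.player.physicsBody.velocity = CGVectorMake(self.player.physicsBody.velocity.dx, 400);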
Matt Wilding
objective-c ios nsurlconnection grand-central-dispatch
NSData has always had a very convenient method called +dataWithContentsOfURL:options:error:. While convenient, it also blocks execution of the current thread, which meant it was basically useless for production code (Ignoring NSOperation). I used this method so infrequently, I completely forgot that it existed. Until recently.The way I’ve been grabbing data from the tubes is the standard NSURLConnectionDelegate approach: Write a download class that handles the various NSURLConnectionDelegate met
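A minimal sketch of keeping that convenient method off the main thread with GCD (the URL and the completion handling are placeholders):

NSURL *url = [NSURL URLWithString:@"http://example.com/resource.json"];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSError *error = nil;
    NSData *data = [NSData dataWithContentsOfURL:url options:0 error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (data) {
            // update the UI with the downloaded data
        } else {
            NSLog(@"Download failed: %@", error);
        }
    });
});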
aLFaRSi
objective-c xcode
I'm trying to make an NSMutableArray from NSUserDefaults so I can add/delete and edit it later. My code is:
- (void)viewDidLoad {
    [super viewDidLoad];
    NSUserDefaults *ArrayTable = [NSUserDefaults standardUserDefaults];
    [ArrayTable setObject:@"One" forKey:@"myArray"];
    [ArrayTable setObject:@"Two" forKey:@"myArray"];
    [ArrayTable setObject:@"Three" forKey:@"myArray"];
    [ArrayTable setObject:@"Four" forKey:@"myArray"];
    [ArrayTable setObject:@"Five" forKey:@"myArray"];
    [ArrayTable synchronize];
    NSMutableArr
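Worth noting: each setObject:forKey: call above overwrites the previous value stored under @"myArray". A sketch of storing the whole array once and reading back a mutable copy (the key name is taken from the question, the rest is illustrative):

NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
NSArray *initial = @[@"One", @"Two", @"Three", @"Four", @"Five"];
[defaults setObject:initial forKey:@"myArray"];
[defaults synchronize];

// NSUserDefaults hands back immutable objects, so take a mutable copy before editing.
NSMutableArray *editable = [[defaults arrayForKey:@"myArray"] mutableCopy];
[editable addObject:@"Six"];
[editable removeObjectAtIndex:0];
[defaults setObject:editable forKey:@"myArray"];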
Rob Caraway
ios objective-c xcode xcodebuild osx-server
I’m getting this error when trying to use an Xcode bot to integrate for unit tests: Integration failed. Unexpected error while building. See the integration’s logs for more details. I couldn’t make any sense of the logs. They pretty much spit out any error with the system, whether related or not. I use a remote repository from GitHub, testing currently on only 1 device. I’m .gitignore-ing the standard stuff from Xcode. Here are the most relevant messages from the log I could find: ’2014-01-01 10
Hammerhead96
objective-c arrays ipad uitableview plist
I am working on a project for iPad using a storyboard. I am trying to populate my UITableView with data from a plist. The plist has an NSDictionary, which in turn has 9 arrays, each with three elements (one number and two strings). I want to take the second element [1] from each array (which is a string) and populate the table with those strings. From the header file:
@interface EonOrderOfPrinciple : UIViewController <UITableViewDataSource, UITableViewDelegate>
@property (strong, nonatomic) IBOutl
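A sketch of pulling the second element out of each array to back the table; the plist file name, the sort order, and the property names are assumptions:

NSString *path = [[NSBundle mainBundle] pathForResource:@"Principles" ofType:@"plist"];
NSDictionary *root = [NSDictionary dictionaryWithContentsOfFile:path];

NSMutableArray *titles = [NSMutableArray array];
for (NSString *key in [root.allKeys sortedArrayUsingSelector:@selector(compare:)]) {
    NSArray *entry = root[key];          // number, string, string
    if (entry.count > 1) {
        [titles addObject:entry[1]];     // the second element feeds the table
    }
}
// titles now backs numberOfRowsInSection: and cellForRowAtIndexPath: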
carexcer
ios objective-c cocoa-touch admob chartboost
So after a week of attempting to load Chartboost into my non-ARC iOS app in Xcode, I’m going to start asking some really silly questions. My code in my app delegate is:
- (void)applicationDidBecomeActive:(UIApplication *)application {
    [[CCDirector sharedDirector] resume];
    Chartboost *cb = [Chartboost sharedChartboost];
    cb.appId = @"530dd707f8975c182ae2c691";
    cb.appSignature = @"0d8726e69c911a182b0cefac4eca36f692355725";
    // Required for use of delegate methods. See "Advanced Topics" section below.
    cb.delegate
Ben Packard
objective-c cocoa nstimer
I have a snippet of code I want to execute repeatedly, but with the ability to pause and resume it. To do this, I have utilised an NSTimer, which I can stop and start as required.Within the snippet, I use a sleep command to wait for something to update (0.3 seconds). The timer is firing every 0.5 seconds.What would be ideal is to keep the stop and start functionality, and be firing every 0.3 seconds, but to not have to explicitly say I want it to fire every x seconds. The 0.5 is completely ar
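One alternative that keeps the start/stop behaviour but removes the arbitrary 0.5 s period is to drop the repeating NSTimer and reschedule only after each pass of the work finishes; a sketch, where running, tick, and doWorkThatTakesAbout0_3Seconds are illustrative names:

- (void)start {
    self.running = YES;   // assumed BOOL property
    [self tick];
}

- (void)stop {
    self.running = NO;
}

- (void)tick {
    if (!self.running) return;
    [self doWorkThatTakesAbout0_3Seconds];   // the body that currently sleeps 0.3 s
    // Reschedule only once the work has finished, instead of a fixed 0.5 s period.
    [self performSelector:@selector(tick) withObject:nil afterDelay:0.0];
}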
ikevinjp
iphone ios
What is the best practice for logging errors/events in an iPhone application? I’m not talking about debugging, but after an app has been released. I mean, I’d like to collect error/event logs when the app is running in release mode (not debug mode). (When needed I can ask the user to voluntarily send the file to my server for analysis.) (Does NSLog have any effect if it is not running in debug? If so, where does it write to? And how do I clear any contents programmatically?)
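For reference, NSLog still writes to the Apple System Log (visible in the device console) in release builds, but that log is not easily collected from within the app. A common workaround is to redirect stderr, which NSLog uses, to a file in the Documents directory; a sketch, with the file name as a placeholder:

NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *logPath = [documents stringByAppendingPathComponent:@"app.log"];

// From now on NSLog/stderr output lands in app.log instead of the console.
freopen([logPath fileSystemRepresentation], "a+", stderr);

NSLog(@"App launched");

// "Clearing the contents programmatically" amounts to truncating or deleting app.log,
// typically done before redirecting again on the next launch.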
shanegao
ios cocoapods
I just ran pod install and got this message in the terminal:
CocoaPods 0.29.0 is available.
2014-02-17 00:25:35.875 ruby[65690:507] CFPropertyListCreateFromXMLData(): Old-style plist parser: missing semicolon in dictionary on line 908. Parsing will be abandoned. Break on _CFPropertyListMissingSemicolon to debug.
--- MARKDOWN TEMPLATE ---
### Report
* What did you do?
* What did you expect to happen?
* What happened instead?
### Stack
CocoaPods : 0.27
bkaid
ios facebook upload photo
I’d be grateful for any suggestions as to how to deal with a problem I’m having posting an image to Facebook. The following code works just as I would expect:
NSMutableDictionary* params = [NSMutableDictionary dictionaryWithObjectsAndKeys:
    @"http://myserver/img.png", @"picture",
    @"My Testing", @"name",
    @"Enter some text…", @"message",
    nil];
[appDelegate.facebook dialog:@"feed" andParams:params andDelegate:self];
However, I really want to use an image that I capture from my camera. I’ve set up the cod
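The feed dialog's picture parameter only accepts a URL, so a camera capture generally has to be uploaded as image data instead, e.g. through the Graph API of the same (old, now deprecated) facebook-ios-sdk. A sketch, assuming that SDK's requestWithGraphPath:andParams:andHttpMethod:andDelegate: is available and that the image arrives via UIImagePickerController:

// UIImagePickerControllerDelegate callback for the camera.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *captured = info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];

    NSMutableDictionary *params = [NSMutableDictionary dictionaryWithObjectsAndKeys:
                                   captured, @"picture",
                                   @"My Testing", @"message",
                                   nil];
    // Graph API photo upload instead of the feed dialog (old facebook-ios-sdk).
    [appDelegate.facebook requestWithGraphPath:@"me/photos"
                                     andParams:params
                                 andHttpMethod:@"POST"
                                   andDelegate:self];
}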
flexaddicted
ios ios5 automatic-ref-counting weak strong
This question already has an answer here: Should IBOutlets be strong or weak under ARC? (6 answers). I have switched my project to ARC, and I don’t understand whether I have to use strong or weak for IBOutlets. Xcode does this: in Interface Builder, if I create a UILabel, for example, and connect it with the assistant editor to my ViewController, it creates this:
@property (nonatomic, strong) UILabel *aLabel;
It uses strong, whereas a tutorial I read on the RayWenderlich website says this: But for these two part
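A short illustration of the usual guidance: outlets pointing at subviews that the view hierarchy already retains can be weak, while top-level objects (or anything nothing else retains) should be strong:

// Subview of the controller's view: the view hierarchy keeps it alive.
@property (nonatomic, weak) IBOutlet UILabel *aLabel;

// Top-level object not placed inside the view hierarchy: nobody else retains it.
@property (nonatomic, strong) IBOutlet UIView *detachedOverlayView;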
mmorris
ios keyboard size uikeyboard nsnotificationcenter
The following code (sorry for the length) displays an odd behavior under iOS 4.3 (maybe other versions too). In this example, there are three UITextFields that have three different sized keyboards. If you start editing one text field and then touch "return", dismissing the keyboard, each time the keyboard size is returned correctly in UIKeyboardWillShowNotification and UIKeyboardDidShowNotification using UIKeyboardFrameBeginUserInfoKey. See below:
- (void)keyboardWillShowNotification:(NSNotifica
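For comparison, using the end frame key and converting the rect into the view's coordinate space is the usual way to get a consistent size across keyboard variants; a sketch:

- (void)keyboardWillShowNotification:(NSNotification *)notification {
    NSDictionary *info = notification.userInfo;
    CGRect keyboardFrame = [info[UIKeyboardFrameEndUserInfoKey] CGRectValue];
    // Convert from screen coordinates into this view's coordinates (handles rotation).
    CGRect converted = [self.view convertRect:keyboardFrame fromView:nil];
    NSLog(@"Keyboard size: %@", NSStringFromCGSize(converted.size));
}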
chris
javascript jquery ios cordova
I am using PhoneGap 3.0.0, and in the process of attempting to upload a file to the server I am getting a couple of things that are unexpected. First, a script error: Error: SyntaxError: Unexpected token ‘:’ line 624 of phonegap.js (which I think is a much older version of the JS to begin with, as I could only find one on GitHub). Next, I am getting something I don’t see why/how would prompt, since I have never seen it in other apps: a little alert dialog. When I click OK on the dialog thats
John Riselvato
ios csv
I’m trying to implement a label that shows me the current location’s weather data as the following:
NSString *request = [NSString stringWithFormat:@"http://api.worldweatheronline.com/free/v1/weather.ashx?q=%@&format=csv&num_of_days=0&show_comments=no&key=myKeyThatIRemovedForThisQuestion", city];
NSURL *URL = [NSURL URLWithString:request];
NSError *error;
NSString *csv = [NSString stringWithContentsOfURL:URL encoding:NSASCIIStringEncoding error:&error];
NSArray *items = [csv compone
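A sketch of splitting the returned CSV into lines and fields once the string is back (the column index and the assumption that comment/header lines start with '#' are guesses about the service's format):

NSArray *lines = [csv componentsSeparatedByString:@"\n"];
for (NSString *line in lines) {
    // Skip blank lines and any commented header lines the service may prepend.
    if (line.length == 0 || [line hasPrefix:@"#"]) continue;
    NSArray *fields = [line componentsSeparatedByString:@","];
    NSLog(@"fields: %@", fields);
    // e.g. pick the temperature column by index once the format is known:
    // self.weatherLabel.text = fields[1];
}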
Luka
ios avfoundation
I’m trying to use the method appendPixelBuffer:withPresentationTime: of the class AVAssetWriterInputPixelBufferAdaptor. I really struggle to understand how the parameter presentationTime is meant to be used. Let’s consider for example the case where I have to create a video of 4.32 seconds with an image. What I’m doing right now is to create a buffer with the image:
CVPixelBufferRef buffer = ....
and then use [adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];
I’ve tried both
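One way to think about presentationTime: it is the timestamp at which the appended frame starts being displayed. So a 4.32-second movie from a single image can be produced by appending the buffer at time zero, appending it once more near the end, and ending the session at the full duration. A rough sketch, with assetWriter, adaptor, and buffer assumed to be set up as in the question:

CMTime duration = CMTimeMakeWithSeconds(4.32, 600);   // timescale of 600 ticks per second

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

// The single image is displayed starting at t = 0...
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
// ...and appended once more at the end so the movie actually lasts 4.32 s.
[adaptor appendPixelBuffer:buffer withPresentationTime:duration];

[adaptor.assetWriterInput markAsFinished];
[assetWriter endSessionAtSourceTime:duration];
[assetWriter finishWritingWithCompletionHandler:^{
    // inspect assetWriter.status / assetWriter.error here
}];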
SpacyRicochet
iphone cocoa-touch avfoundation avplayer live-streaming
I’m trying to create a more generic media controller for several types of streaming media and want to adapt the UI to the type of stream. When it’s an on-demand file stream (i.e. a single MP3 file that’s being streamed), you should be able to seek forward and backward; thus, the seek slider should be visible. When it’s a live stream, it isn’t possible to seek forward and backward, and thus the seek slider should be hidden. Is there any way to determine from the AVPlayer (or perhaps the AVPlayerIte
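One commonly used signal is the item's duration: a live stream reports an indefinite duration, while an on-demand file reports a finite one (only meaningful once the item's status is ready to play). A sketch, with player and seekSlider as assumed names:

// Check after the item reports AVPlayerItemStatusReadyToPlay (e.g. via KVO on status).
AVPlayerItem *item = player.currentItem;
CMTime duration = item.duration;

if (CMTIME_IS_INDEFINITE(duration)) {
    // Live stream: seeking makes no sense, hide the slider.
    self.seekSlider.hidden = YES;
} else {
    // On-demand: allow seeking across the known duration.
    self.seekSlider.hidden = NO;
    self.seekSlider.maximumValue = (float)CMTimeGetSeconds(duration);
}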
Gaurav Garg
ios iphone objective-c xcode avfoundation
I want to control the shutter speed of the AVCaptureDevice. Searching Google I found this: change shutter speed, and learned that there are runtime-header methods for AVCaptureDevice that are not documented on developer.apple.com. It seems these methods can be reached with: 1. otool - but this seems outdated or blocked by Apple. 2. class-dump. I tried to use class-dump but was not successful. I have downloaded the class-dump file and used the terminal command line but
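As a side note, iOS 8 later added a public API for exactly this, so private runtime headers are not needed on current systems; a sketch of the documented route:

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;

if ([device lockForConfiguration:&error]) {
    // Shutter speed of 1/60 s at ISO 100 (real code should clamp both values
    // to the ranges the device's activeFormat actually supports).
    [device setExposureModeCustomWithDuration:CMTimeMake(1, 60)
                                          ISO:100
                            completionHandler:^(CMTime syncTime) {
        // The custom exposure has been applied by this point.
    }];
    [device unlockForConfiguration];
}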
Codo
iphone multithreading ios camera avfoundation
I’m using the AV Foundation classes to capture the live video stream from the camera and to process the video samples. This works nicely. However, I do have problems properly releasing the AV Foundation instances (capture session, preview layer, input and output) once I’m done. When I no longer need the session and all associated objects, I stop the capture session and release it. This works most of the time. However, sometimes the app crashes with an EXC_BAD_ACCESS signal raised in a second thread
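A sketch of a teardown order that tends to avoid callbacks reaching a half-released object: stop the session, detach the sample-buffer delegate, then remove inputs and outputs before letting go of the session (the property names are assumptions about a typical setup):

- (void)teardownCapture {
    [self.captureSession stopRunning];

    // Make sure no further captureOutput:didOutputSampleBuffer:fromConnection: calls reach us.
    [self.videoOutput setSampleBufferDelegate:nil queue:NULL];

    [self.previewLayer removeFromSuperlayer];
    self.previewLayer = nil;

    for (AVCaptureInput *input in [self.captureSession.inputs copy]) {
        [self.captureSession removeInput:input];
    }
    for (AVCaptureOutput *output in [self.captureSession.outputs copy]) {
        [self.captureSession removeOutput:output];
    }
    self.captureSession = nil;
}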
Topsakal
ios ios7 avfoundation
One of the apps that I developed long ago (compiled for iOS 4) started to crash after the iOS 7 update. I opened the app using Xcode 5 and tried to compile. I am getting an error from the AVSpeechSynthesis.h file. My app’s main functionality is to play MP3 audio files. In the header of the .mm file that plays audio, I have the following headers:
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AudioToolbox/AudioServices.h>
#import <CoreGraphics/Cor
lunadiviner
objective-c ios xcode avfoundation
I am trying to implement video capture in my app using AVFoundation. I have the following code under viewDidLoad:
session = [[AVCaptureSession alloc] init];
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
videoInputDevice = [[AVCaptureDeviceInput alloc] init];
AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
if (videoDevice) {
    NSError *error;
    videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (!error) {
        if ([session canAddInput
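As an aside, the [[AVCaptureDeviceInput alloc] init] line creates an input with no device and is immediately thrown away; a trimmed sketch of the usual setup, reusing the question's frontFacingCameraIfAvailable helper:

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
if (videoDevice) {
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput =
        [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput && [session canAddInput:videoInput]) {
        [session addInput:videoInput];
    }
}

movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput]) {
    [session addOutput:movieFileOutput];
}
[session startRunning];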
adbie
iphone ios opengl-es-2.0 avfoundation video-processing
I need to process the video frames from a remote video in real time and present the processed frames on screen. I have tried using AVAssetReader, but because the AVURLAsset is accessing a remote URL, calling AVAssetReader initWithAsset: will result in a crash. AVCaptureSession seems good, but it works with the camera and not a video file (much less a remote one). As such, I am now exploring this: display the remote video in an AVPlayerLayer, and then use GL ES to access what is displayed. Questions: H
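On iOS 6 and later, AVPlayerItemVideoOutput is usually a simpler route than reading back what AVPlayerLayer drew: attach it to the remote player item and copy pixel buffers per frame (typically driven by a CADisplayLink). A sketch, with remoteURL as a placeholder:

NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];

AVPlayerItem *item = [AVPlayerItem playerItemWithURL:remoteURL];
[item addOutput:videoOutput];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

// Called from a CADisplayLink callback: grab the frame that should be on screen now.
CMTime now = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    // ... upload pixelBuffer to a GL ES texture (e.g. via CVOpenGLESTextureCache) and process ...
    CVBufferRelease(pixelBuffer);
}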
user1754032
objective-c audio video avfoundation avassetwriter
I’m trying to take a video created using the iVidCap plugin and add audio to it. Basically the exact same thing as in this question: Writing video + generated audio to AVAssetWriterInput, audio stuttering. I’ve used the code from this post as a basis to try and modify the iVidCap.mm file myself, but the app always crashes in endRecordingSession. I’m not sure how I need to modify endRecordingSession to accommodate the audio (the original plugin just creates a video file). Here is the function: -
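Without the modified plugin it is hard to be specific, but one requirement once a second (audio) input is added is that every input is marked finished before the writer itself is finished; a sketch of the closing sequence with assumed variable names:

// Both inputs must be marked finished before the writer is finished.
[videoWriterInput markAsFinished];
[audioWriterInput markAsFinished];

[assetWriter endSessionAtSourceTime:lastPresentationTime];
[assetWriter finishWritingWithCompletionHandler:^{
    if (assetWriter.status == AVAssetWriterStatusFailed) {
        NSLog(@"Writing failed: %@", assetWriter.error);
    }
}];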
dpcasady
objective-c ios avfoundation
I’m reading LPCM samples from a track in the iPod library, by means of the export functionality of AV Foundation. Most of my code is borrowed from Chris Adamson’s example here. I’m setting up a new CMBlockBufferRef and retaining it with CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer. According to the Apple CFType reference, I need to use CFRelease on any object I explicitly retain. The problem is that I can’t seem to figure out where exactly to release the CMBlockBufferRef object. If I do
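A sketch of the usual lifetime: the block buffer returned by CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer owns the memory the AudioBufferList points into, so it is released only after those bytes have been consumed (and the sample buffer obtained from copyNextSampleBuffer is released separately):

CMBlockBufferRef blockBuffer = NULL;
AudioBufferList audioBufferList;

CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                                                        NULL,
                                                        &audioBufferList,
                                                        sizeof(audioBufferList),
                                                        NULL,
                                                        NULL,
                                                        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                        &blockBuffer);

for (UInt32 i = 0; i < audioBufferList.mNumberBuffers; i++) {
    AudioBuffer buffer = audioBufferList.mBuffers[i];
    // ... copy or process buffer.mData / buffer.mDataByteSize here ...
}

// Only now is the backing storage no longer needed.
CFRelease(blockBuffer);
CFRelease(sampleBuffer);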
Brad Larson
ios avfoundation gpuimage
GPUImage has allowed me to manipulate my video files incredibly efficiently in just the way I desire, but only if I use AVFileTypeQuickTimeMovie when I specify the fileType of the GPUImageMovieWriter. My application needs to interact with Android devices, which are incapable of playing QuickTime movie files, so I attempted to change my code to encode as follows:
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen ma
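For reference, GPUImageMovieWriter also has an initializer that takes the file type and output settings explicitly, so the MPEG-4 container can be requested up front; the H.264 settings below are a plausible configuration rather than anything taken from the question:

NSDictionary *videoSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @([UIScreen mainScreen].bounds.size.width),
    AVVideoHeightKey : @([UIScreen mainScreen].bounds.size.height)
};

movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                       size:CGSizeMake([UIScreen mainScreen].bounds.size.width,
                                                                       [UIScreen mainScreen].bounds.size.height)
                                                   fileType:AVFileTypeMPEG4
                                             outputSettings:[videoSettings mutableCopy]];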