Related issues: ios, monotouch, avfoundation, libgdx, objectal - Collection of common programming errors
m1crdy
ios cordova
I'm getting this error in Xcode: 2013-08-23 14:36:18.284 Tell The DJ[14955:c07] ERROR: Plugin 'Device' not found, or is not a CDVPlugin. Check your plugin mapping in config.xml. 2013-08-23 14:36:18.284 Tell The DJ[14955:c07] -[CDVCommandQueue executePending] [Line 116] FAILED pluginJSON = ["Device1096677259","Device","getDeviceInfo",[]] 2013-08-23 14:36:18.285 Tell The DJ[14955:c07] CDVPlugin class CDVConnection (pluginName: NetworkStatus) does not exist. 2013-08-23 14:36:18.285 Tell The DJ[1495
Mexyn
objective-c ios delegates uiscrollview subclass
Here is what I want to achieve: I want to subclass a UIScrollView to add functionality. The subclass should be able to react to scrolling, so I have to set the delegate property to self to receive events like: - (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView { ... } On the other hand, other classes should still be able to receive these events too, as if they were using the base UIScrollView class. I have had different ideas on how to solve this problem, but all of these are n
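The pattern this question is after (the subclass reacts to the event itself, then forwards it to an outside delegate) can be sketched outside of UIKit. Below is a minimal Java analogue, where ForwardingScrollView and ScrollListener are made-up names standing in for the UIScrollView subclass and its delegate protocol:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical analogue of a scroll-view delegate chain: the subclass keeps
// the delegate slot for itself and forwards every event to external listeners.
class ForwardingScrollView {
    interface ScrollListener {
        void onScrollEnded(double offset);
    }

    private final List<ScrollListener> externalListeners = new ArrayList<>();
    double lastOffset; // internal state the subclass wants to track itself

    void addListener(ScrollListener l) { externalListeners.add(l); }

    // Called by the "framework": react internally first, then forward,
    // so outside observers still see the event as if unwrapped.
    void scrollViewDidEndDecelerating(double offset) {
        lastOffset = offset;
        for (ScrollListener l : externalListeners) {
            l.onScrollEnded(offset);
        }
    }
}
```

The same idea in Objective-C is usually done with a second delegate property plus message forwarding, so external code never notices the interception.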
clint
javascript objective-c ios phonegap cordova
I am using PhoneGap/Cordova (2.1.0) to create an iOS app. I want to call a JavaScript function in the PhoneGap index.html file from an Objective-C function, so I am creating an instance 'theWebView' of the 'UIWebView' class like below. Code in AppDelegate.h: #import <UIKit/UIKit.h> #import <Cordova/CDVViewController.h> #import "sqlite3.h" @interface AppDelegate : NSObject <UIApplicationDelegate> { NSString* invokeString; } @property (nonatomic, strong) IBOutlet UIWebView* theWebView
Grangji
ios streaming radio
Currently I am developing an app that will stream a radio station from the internet. I watched this tutorial and implemented it step by step. It worked fine for a while, but now I am getting the following error and the app won't stream anything: 2013-07-20 10:22:40.653 ShqipCom[464:c07] [MPAVController] Autoplay: Enabling autoplay 2013-07-20 10:22:40.668 ShqipCom[464:c07] [MPCloudAssetDownloadController] Prioritization requested for media item ID: 0 2013-07-20 10:22:41.129 ShqipCom[464:c07] [MPAVCon
Alex Chumbley
ios objective-c core-data parse.com facebook-sdk-3.0
In my iOS app, we have the user log in using Facebook to grab some of their information. I've included the code below that runs when the user presses the button that reads "Log in with Facebook": - (IBAction)toQuadAction:(id)sender { // Query to fetch the user's name and picture NSString *query = @"SELECT name, username FROM user WHERE uid=me()"; // Set up the query parameter NSDictionary *queryParam = [NSDictionary dictionaryWithObjectsAndKeys:query, @"q", nil]; // Make the API request that use
user2358649
ios opengl-es textures render movie
I am generating a movie by rendering OpenGL textures. Some frames of the resulting movie look incompletely rendered, as they show part of the previous frame. If I add [NSThread sleepForTimeInterval:0.05]; the problem does not show up, but I can't rely on that call. This is the result without [NSThread sleepForTimeInterval:0.05]. This is the result when adding [NSThread sleepForTimeInterval:0.05]. The code I use is: dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIOR
SpacyRicochet
ios facebook ios5 facebook-ios-sdk
I'm trying to allow the user to post a simple Facebook status update without using Open Graph. So far, allowing the user to log in and asking for publish_actions permissions goes without a hitch. However, when I try to call presentShareDialogWithLink:name:caption:description:picture:clientState:handler:, it always returns nil and shows nothing. It doesn't even appear to call the handler, which leaves me at a loss as to why it doesn't work. What are the reasons this can fail? If I knew what
JeanSeb
ios air air-native-extension
For numbers in (), see links at the end. I am developing an AIR Native Extension (ANE) in order to be able to use Burstly ads on mobile devices. As for my setup, I am using FlashDevelop 4.0.4 and Adobe AIR 3.3 with the Flex 4.6.0 SDK. The project is set up on Windows 7. In order to work on iOS devices, Burstly requires a number of frameworks and libraries. I managed to write and compile the .a library for iOS. I also have the interface (ActionScript code) that will be shared for Android and iO
Ben Holmes
ios image video monotouch camera
I have been attempting to do some real-time video image processing in MonoTouch. I'm using AVCaptureSession to get frames from the camera, which works with an AVCaptureVideoPreviewLayer. I also successfully get the callback method "DidOutputSampleBuffer" in my delegate class. However, every way that I have tried to create a UIImage from the resulting CMSampleBuffer fails. Here is my code setting up the capture session: captureSession = new AVCaptureSession (); captureSession.BeginConfiguration (); vide
Nicholas Petersen
ios iframe uiwebview
It sounds like native iOS app code can receive events whenever a link is clicked within a UIWebView, but is this still true when the link is within an iframe inside the UIWebView? See here, here, here, etc. In our case, we just want to display a blog post (etc.) within an iframe, with that iframe wrapped by a bare HTML document. But if the user clicks any links from there, we want the app to detect that and navigate outside of our app to the external browser (perhaps with an alert w
Martin Stafford
.net mono monotouch
I am aware that the compiler can sometimes omit methods if it thinks nothing uses them, and that you can work around this by referencing the method directly in your code to force the compiler to include it. However, in this instance it doesn't seem to work. Here's the exception and call stack: System.ExecutionEngineException: Attempting to JIT compile method '(wrapper managed-to-native) System.Threading.Interlocked:Exchange (System.Threading.Tasks.IContinuation&,System.Threading.Tasks.IContinu
poupou
iphone .net ios monotouch ews-managed-api
I'm trying to build an iOS app using the Microsoft EWS Managed API. I've downloaded the MSI from http://www.microsoft.com/download/en/details.aspx?id=13480 and copied the DLL under the solution folder. When building for the simulator, it builds and runs fine, but when trying to target the physical device, mtouch fails with: Compiling to native code /Developer/MonoTouch/usr/bin/mtouch -v --nomanifest --nosign -dev "/Users/Projects/HelloMonoTouch/MyEWSApp/bin/iPhone/Release/MyEWSApp.app" -r "/Developer/M
BitKFu
c# serialization monotouch protobuf-net
We are currently facing the problem that we want to port an application currently running on Android (Monodroid) and WP7 to iOS using MonoTouch. That would not be a problem, but deserializing data using the protobuf-net framework constantly fails with the following exception: ProtoBuf.ProtoException: Invalid wire-type; this usually means you have over-written a file without truncating or setting the length; see http://stackoverflow.com/q/2152978/23354 at ProtoBuf.ProtoReader.StartSu
Dan
javascript ios uiwebview monotouch onerror
Here's an answer to a similar question in Objective-C, but I'm not sure what the right way to translate it to MonoTouch is. Basically, I want to be able to catch JavaScript errors and know at least the filename and line number; unfortunately, window.onerror doesn't give this crucial information. In particular, I'm not sure if I should expose a native library or if I can write this in pure MonoTouch.
TenaciousG
nunit monotouch
Has anyone successfully managed to run unit tests for their MonoTouch project? I have read other posts where people are told to manually add references to the appropriate assemblies. Doing this makes the project compile, but it will still not run the tests. I have a solution with two projects: a MonoTouch navigation project and an NUnit library project. I added a reference to my MonoTouch project, and to the MonoTouch and other needed assemblies, to the test project. Tests that only run code outsi
poupou
ios cocoa-touch linker monotouch static-libraries
I have created a Cocoa Touch static library using Xcode 4 and I want to use it in a MonoTouch project. How do I proceed? Here is the content of my static library: MyClass.h, MyClass.m. I built using "Build For Archiving" after following THIS BLOG POST, took the libMyLib.a it generated, and added it to a new MonoTouch binding project. Then I replaced the content of libMyLib.linkwith.cs, because THIS BLOG POST said so. [assembly: LinkWith ("libMyLib.a", LinkTarget.ArmV6 | LinkTarget.ArmV7 | LinkTar
Aaron
monotouch in-app-purchase xamarin
In my Xamarin app, when I call this method: private void MakePayment (SKProduct product) { SKPayment payment = SKPayment.PaymentWithProduct (product); SKPaymentQueue.DefaultQueue.AddPayment (payment); } I get this error: Failed to marshal the Objective-C object 0x14607110 (type: SKPaymentTransaction). Could not find an existing managed instance for this object, nor was it possible to create a new managed instance (because the type 'MonoTouch.StoreKit.SKPaymentTransaction[]' does not have a constructor that
SmartK8
osx monotouch install version
I've run into a problem with MonoTouch. I was developing in MonoDevelop as usual when the update screen popped up; I complied, let it install the updates, and restarted MonoDevelop. But as soon as I tried to compile the project for iPhone, it complained that my license had expired. I checked, and it was unfortunately correct. So I looked at my options and tried the new free MonoTouch option first, but that failed to install because my Mac OS X is only version 10.6.7. So I fell back
krtrego
c# objective-c monotouch xamarin
Note: this isn't an app for the App Store, so using private APIs isn't a concern to me. I've been trying to toggle Airplane Mode off and back on using Xamarin/MonoTouch, based on this SO question: How to turn on/off airplane mode in IOS 5.1 using private API. I'm having a hard time getting this code to run in MonoTouch. The first thing I tried was to create a native Objective-C library using a MonoTouch binding project. The AppSupport framework is used, so my linkwith file is as follows: [assembly: Link
jlw
ios avfoundation avasset avassetreader
After implementing the solution for encoding video (with audio) in this question, Video Encoding using AVAssetWriter - CRASHES, I found that the code works correctly in the iPhone Simulator. Unfortunately, certain videos fail to encode their audio while running on an actual iPhone 5 (and other devices). For example, videos generated from the WWDC 2011 sample code RosyWriter (https://developer.apple.com/library/IOS/samplecode/RosyWriter/Introduction/Intro.html) do not completely encode because the
Luka
ios avfoundation
I'm trying to use the method appendPixelBuffer:withPresentationTime: of the class AVAssetWriterInputPixelBufferAdaptor. I really struggle to understand how the presentationTime parameter is meant to be used. Let's consider, for example, the case where I have to create a 4.32-second video from a single image. What I'm doing right now is to create a buffer with the image, CVPixelBufferRef buffer = ..., and then use [adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];. I've tried both
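For context, CMTime (the type of presentationTime) is a rational timestamp: a value divided by a timescale. The sketch below is a hypothetical Java model of that arithmetic, not the AVFoundation API; under it, a 4.32-second mark would correspond to CMTimeMake(432, 100):

```java
// CMTime is essentially a rational number: value / timescale seconds.
// RationalTime is a made-up model class used only to illustrate the idea.
class RationalTime {
    final long value;
    final int timescale;

    RationalTime(long value, int timescale) {
        this.value = value;
        this.timescale = timescale;
    }

    double seconds() { return (double) value / timescale; }

    // Presentation time of frame n at a given frame rate; using the frame
    // rate as the timescale keeps the timestamp exact (no rounding).
    static RationalTime frameTime(long frameIndex, int fps) {
        return new RationalTime(frameIndex, fps);
    }
}
```

The practical consequence: each appended buffer's presentation time marks when that frame starts, so a single image shown for 4.32 s is typically appended at time zero, with the end of the session (or a second append) marking the 4.32-second point.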
SpacyRicochet
iphone cocoa-touch avfoundation avplayer live-streaming
I'm trying to create a more generic media controller for several types of streaming media and want to adapt the UI to the type of stream. When it's an on-demand file stream (i.e. a single MP3 file that's being streamed), you should be able to seek forward and backward; thus, the seek slider should be visible. When it's a live stream, it isn't possible to seek forward and backward, so the seek slider should be hidden. Is there any way to determine from the AVPlayer (or perhaps the AVPlayerIte
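One commonly suggested heuristic, offered here as an assumption rather than a confirmed answer: a live AVPlayerItem reports an indefinite duration, so the seek slider can be toggled on whether the duration is a finite positive number. A Java sketch, with NaN standing in for kCMTimeIndefinite (LiveStreamCheck is a made-up name):

```java
// Models the "duration is indefinite" check for live streams.
// In AVFoundation the indefinite sentinel is kCMTimeIndefinite; here we
// represent an indefinite duration as NaN for illustration.
class LiveStreamCheck {
    static boolean isSeekable(double durationSeconds) {
        // Finite, positive duration: on-demand content, show the slider.
        // NaN (indefinite) or zero: treat as live, hide the slider.
        return !Double.isNaN(durationSeconds) && durationSeconds > 0;
    }
}
```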
Gaurav Garg
ios iphone objective-c xcode avfoundation
I want to control the shutter speed of the AVCaptureDevice. By searching on Google I found this: change shutter speed, and came to know that there are runtime header methods for AVCaptureDevice that are not available under the developer.apple documentation. I found that we can use these approaches: 1. Using otool, but it seems this is outdated or blocked by Apple. 2. class-dump. I tried to use the second one, class-dump, but was not successful. I have downloaded the class-dump file and used the terminal command line, but
Codo
iphone multithreading ios camera avfoundation
I'm using the AV Foundation classes to capture the live video stream from the camera and to process the video samples. This works nicely. However, I do have problems properly releasing the AV Foundation instances (capture session, preview layer, input and output) once I'm done. When I no longer need the session and all associated objects, I stop the capture session and release it. This works most of the time. However, sometimes the app crashes with an EXC_BAD_ACCESS signal raised in a second thread
Topsakal
ios ios7 avfoundation
One of the apps that I developed long ago (compiled for iOS 4) started to crash after the iOS 7 update. I opened the app using Xcode 5 and tried to compile. I am getting an error from the AVSpeechSynthesis.h file. My app's main functionality is to play MP3 audio files. In the header of the .mm file that plays audio, I have the following imports: #import <Foundation/Foundation.h> #import <AudioToolbox/AudioToolbox.h> #import <AudioToolbox/AudioServices.h> #import <CoreGraphics/Cor
lunadiviner
objective-c ios xcode avfoundation
I am trying to implement video capture in my app using AVFoundation. I have the following code under viewDidLoad: session = [[AVCaptureSession alloc] init]; movieFileOutput = [[AVCaptureMovieFileOutput alloc] init]; videoInputDevice = [[AVCaptureDeviceInput alloc] init]; AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable]; if (videoDevice) { NSError *error; videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; if (!error) { if ([session canAddInput
adbie
iphone ios opengl-es-2.0 avfoundation video-processing
I need to process the video frames from a remote video in real time and present the processed frames on screen. I have tried using AVAssetReader, but because the AVURLAsset is accessing a remote URL, calling AVAssetReader initWithAsset: results in a crash. AVCaptureSession seems good, but it works with the camera, not a video file (much less a remote one). As such, I am now exploring this: display the remote video in an AVPlayerLayer, and then use GL ES to access what is displayed. Questions: H
user1754032
objective-c audio video avfoundation avassetwriter
I'm trying to take a video created using the iVidCap plugin and add audio to it. Basically the exact same thing as in this question: Writing video + generated audio to AVAssetWriterInput, audio stuttering. I've used the code from this post as a basis to try and modify the iVidCap.mm file myself, but the app always crashes in endRecordingSession. I'm not sure how I need to modify endRecordingSession to accommodate the audio (the original plugin just creates a video file). Here is the function: -
dpcasady
objective-c ios avfoundation
I'm reading LPCM samples from a track in the iPod library by means of the export functionality of AV Foundation. Most of my code is borrowed from Chris Adamson's example here. I'm setting up a new CMBlockBufferRef and retaining it with CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer. According to the Apple CFType reference, I need to use CFRelease on any object I explicitly retain. The problem is that I can't seem to figure out where exactly to release the CMBlockBufferRef object. If I do
Susei
java gwt ant libgdx
When I build my project that uses libgdx with Ant, I get a strange error. It says that the class com.google.gwt.dom.client.ImageElement is not found, but it isn't used at all in the code. How can I find out what makes this class necessary? Even searching over the whole project doesn't give any results. It says the error is at PixmapTextureAtlas.java:16 (class source), but there is no code there that uses that ImageElement class. Adding the library containing the com.google.gwt.dom.client.ImageElement class hel
user2510952
android libgdx
I have been following the tutorials on Google and other sites implementing GameHelper with libGDX, and I am not sure why I am getting this error. Here is my main activity: import android.content.Intent; import android.os.Bundle; import android.os.Handler; import android.os.Message; import android.view.View; import android.view.Window; import android.view.WindowManager; import android.widget.RelativeLayout; import com.CrazyEagle.utils.AdsHandler; import com.CrazyEagle.utils.GameAction; import com.badlog
BlueMonster
android android-emulator libgdx
Whenever I try to execute the application on the emulator, the emulator displays an error saying "Unfortunately, My libGDX Game has stopped." The application runs fine in the desktop version, though. I have the latest nightly version, ADT version 18, the latest GWT, and the latest version of Eclipse. My Android SDK Tools is version 19, while my Android SDK Platform-tools is version 11. Any ideas on how to fix this? I was following along with this tutorial: LibGDX tutorial. Here is a screenshot of what I see: Log
Vantrebla
box2d libgdx destroy
I'm using libGDX with Box2D. I have a problem with destroying a joint when a specific body crashes into the ground: the colliding bodies are detected, and then I want to destroy some joints. Whenever I do this, I get an error. I also tested the destroy method separately and get the same error. It's called right after world.step(...); is this right? I read that the error happens when joints are destroyed in the middle of the time step, but how could I do this outside
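Box2D raises an error when bodies or joints are destroyed while the world is locked, which is exactly the situation inside contact callbacks fired during world.step(...). The usual workaround is to queue destruction requests from the callback and flush the queue only after the step returns. A minimal generic sketch of that pattern (DeferredDestroyer is a made-up helper, not a libGDX class):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

// Pattern: never destroy physics objects from inside a collision callback.
// Queue the doomed objects instead, and flush the queue after world.step().
class DeferredDestroyer<T> {
    private final Queue<T> doomed = new ArrayDeque<>();
    private boolean stepping;

    void beginStep() { stepping = true; }   // call just before world.step()
    void endStep()   { stepping = false; }  // call just after it returns

    // Safe to call from a contact listener: only records the request.
    void scheduleDestroy(T obj) { doomed.add(obj); }

    // Call once per frame after endStep(); destroys everything queued.
    int flush(Consumer<T> destroyFn) {
        if (stepping) throw new IllegalStateException("flush during step");
        int n = 0;
        T obj;
        while ((obj = doomed.poll()) != null) {
            destroyFn.accept(obj); // e.g. world.destroyJoint(joint)
            n++;
        }
        return n;
    }
}
```

In a libGDX game loop, destroyFn would wrap world.destroyJoint(...) or world.destroyBody(...), and flush would run right after the step each frame.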
G1i1ch
java scala geometry libgdx
I have Tiled maps working in libGDX; now I'm trying to do collision with these maps. I figured it'd be really useful just to draw polygons in Tiled and use them for collisions in libGDX. I'd actually prefer it this way rather than doing per-tile collision, since it'll give me the collision flexibility the game requires. I've successfully parsed each object's XML and grabbed the polygon points. I've been able to make a Box2D PolygonShape with these, and it shows up in game! But it's mirrored up
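A likely cause of the mirroring described above, offered as an assumption: Tiled uses a top-left origin with Y growing downward, while libGDX and Box2D use a bottom-left origin with Y growing upward, so each vertex's Y coordinate must be flipped against the map height. A sketch of that fix (TiledPolygonFix and flipY are hypothetical names):

```java
// Converts polygon vertices from Tiled's top-left, Y-down coordinate system
// to the bottom-left, Y-up system used by libGDX/Box2D.
// Vertices are packed as [x0, y0, x1, y1, ...] pixel coordinates.
class TiledPolygonFix {
    static float[] flipY(float[] vertices, float mapPixelHeight) {
        float[] out = new float[vertices.length];
        for (int i = 0; i < vertices.length; i += 2) {
            out[i] = vertices[i];                          // X unchanged
            out[i + 1] = mapPixelHeight - vertices[i + 1]; // mirror Y
        }
        return out;
    }
}
```

The flipped array can then be scaled into world units and fed to PolygonShape.set(...).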
Chandrasekhar Malladi
memory-leaks compiler-errors libgdx
As I have previously asked, and since some people asked me to post the Dalvik log, here you go. Please tell me where exactly the memory leak is taking place and the necessary changes I have to make in my code. Thanks! E/dalvikvm(25715): JNI ERROR (app bug): non-zero capacity for NULL pointer: 80000 I/dalvikvm(25715): "GLThread 693" prio=5 tid=11 RUNNABLE I/dalvikvm(25715): | group="main" sCount=0 dsCount=0 obj=0x428f5be0 self=0x73044008 I/dalvikvm(25715): | sysTid=25731 nice=0 sched=0/0
P.T.
java libgdx
I've been trying to start some development using libgdx, and have been following the setup tutorial: https://code.google.com/p/libgdx/wiki/ProjectSetupNew. However, when I try to run the desktop application, I get the following error: ## A fatal error has been detected by the Java Runtime Environment: ## EXCEPTION_ILLEGAL_INSTRUCTION (0xc000001d) at pc=0x6cee60ce, pid=1908, tid=2912 ## JRE version: 7.0_25-b17 # Java VM: Java HotSpot(TM) Client VM (23.25-b01 mixed mode, sharing windows-x86) # Problema
mtrc
java windows libgdx
I'm trying to export a game written in LibGDX, Java, and Flixel-Android. The game was developed on a Mac and runs on other Mac systems in Jar form. When running it on a Windows 7 machine, it quits before completely starting up, and I get this dump: A fatal error has been detected by the Java Runtime Environment: EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x04a2b400, pid=5824, tid=5912 JRE version: 7.0_09-b05 Java VM: Java HotSpot(TM) Client VM (23.5-b02 mixed mode, sharing windows-x86) Problemat
P.T.
jni libgdx dalvik
I have come across a new error in libGDX which says "non-zero capacity for NULL pointer". What does this error mean? E/dalvikvm(28069): JNI ERROR (app bug): non-zero capacity for NULL pointer: 80000
Matt Habel
android libgdx
Hello good people of Stack Overflow, I am having trouble with an application I am writing, as is evident from my title. What I am trying to do is simply play a song, looping, and have a sound effect play whenever the screen is touched. I am doing this using the libGDX library, as I am attempting to learn it for more advanced use. Here is the code: package main; import com.badlogic.gdx.ApplicationListener; import com.badlogic.gdx.Gdx; import com.badlogic.gdx.audio.Music; import com.badlogic.gdx.audi
Erlend D.
ios monotouch avfoundation libgdx objectal
I am currently developing a small multi-platform game with libGDX. The game works great on Windows, Android and OS X, but I can't make it compile for iOS. The libGDX guide for iOS states a couple of caveats, but I'm pretty sure I have taken everything into account. Eclipse, Xamarin.iOS, the JDK, Ant, $PATH and $IKVM_HOME are all set up as they should be. The gamename.dll (which is the Mono-compiled DLL that contains all my game logic, and is automatically generated from the Java code) has been created, an