Related issues for ios, opengl-es, gpuimage: a collection of common programming errors
m1crdy
ios cordova
I'm getting this error in Xcode: 2013-08-23 14:36:18.284 Tell The DJ[14955:c07] ERROR: Plugin 'Device' not found, or is not a CDVPlugin. Check your plugin mapping in config.xml. 2013-08-23 14:36:18.284 Tell The DJ[14955:c07] -[CDVCommandQueue executePending] [Line 116] FAILED pluginJSON = ["Device1096677259","Device","getDeviceInfo",[]] 2013-08-23 14:36:18.285 Tell The DJ[14955:c07] CDVPlugin class CDVConnection (pluginName: NetworkStatus) does not exist. 2013-08-23 14:36:18.285 Tell The DJ[1495…
Mexyn
objective-c ios delegates uiscrollview subclass
Here is what I want to achieve: I want to subclass UIScrollView to add functionality. This subclass should be able to react to scrolling, so I have to set the delegate property to self to receive events like - (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView { … }. On the other hand, other classes should still be able to receive these events too, as if they were using the base UIScrollView class. I have had different ideas about how to solve this problem, but all of these are n…
clint
javascript objective-c ios phonegap cordova
I am using PhoneGap/Cordova (2.1.0) to create an iOS app. I want to call a JavaScript function in the PhoneGap index.html file from an Objective-C function. So I am creating an instance 'theWebView' of the UIWebView class like below. Code in AppDelegate.h: #import <UIKit/UIKit.h> #import <Cordova/CDVViewController.h> #import "sqlite3.h" @interface AppDelegate : NSObject <UIApplicationDelegate> { NSString* invokeString; } @property (nonatomic, strong) IBOutlet UIWebView* theWebView…
Grangji
ios streaming radio
Currently I am developing an app that will stream a radio station from the internet. I watched a tutorial and implemented it step by step. It worked fine for a while, but now I am getting the following error and the app won't stream anything: 2013-07-20 10:22:40.653 ShqipCom[464:c07] [MPAVController] Autoplay: Enabling autoplay 2013-07-20 10:22:40.668 ShqipCom[464:c07] [MPCloudAssetDownloadController] Prioritization requested for media item ID: 0 2013-07-20 10:22:41.129 ShqipCom[464:c07] [MPAVCon…
Alex Chumbley
ios objective-c core-data parse.com facebook-sdk-3.0
In my iOS app, we have the user log in using Facebook to grab some of their information. I've included the code below, which runs when the user presses the button that reads "Log in with Facebook": - (IBAction)toQuadAction:(id)sender { // Query to fetch the user's name and picture NSString *query = @"SELECT name, username FROM user WHERE uid=me()"; // Set up the query parameter NSDictionary *queryParam = [NSDictionary dictionaryWithObjectsAndKeys:query, @"q", nil]; // Make the API request that use…
user2358649
ios opengl-es textures render movie
I am generating a movie by rendering OpenGL textures. Some frames of the resulting movie look incompletely rendered: they show part of the previous frame. If I add [NSThread sleepForTimeInterval:0.05]; the problem does not show up, but I can't rely on this instruction. This is the result without [NSThread sleepForTimeInterval:0.05]. This is the result when adding [NSThread sleepForTimeInterval:0.05]. The code I use is: dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIOR…
SpacyRicochet
ios facebook ios5 facebook-ios-sdk
I'm trying to allow the user to post a simple Facebook status update without using Open Graph. So far, allowing the user to log in and asking for publish_actions permissions goes without a hitch. However, when I try to call presentShareDialogWithLink:name:caption:description:picture:clientState:handler:, it always returns nil and shows nothing. It doesn't even appear to call the handler, which leaves me at a loss as to why it doesn't work. What are the reasons this can fail? If I knew what…
JeanSeb
ios air air-native-extension
For numbers in (), see links at the end. I am developing an AIR Native Extension (ANE) in order to be able to use Burstly ads on mobile devices. As for my setup, I am using FlashDevelop 4.0.4 and Adobe AIR 3.3 with the Flex 4.6.0 SDK. The project is set up on Windows 7. In order to work on iOS devices, Burstly requires a number of frameworks and libraries. I managed to write and compile the .a library for iOS. I also have the interface (ActionScript code) that will be shared for Android and iO…
Ben Holmes
ios image video monotouch camera
I have been attempting to do some real-time video image processing in MonoTouch. I'm using AVCaptureSession to get frames from the camera, which works with an AVCaptureVideoPreviewLayer. I also successfully get the callback method "DidOutputSampleBuffer" in my delegate class. However, every way that I have tried to create a UIImage from the resulting CMSampleBuffer fails. Here is my code setting up the capture session: captureSession = new AVCaptureSession (); captureSession.BeginConfiguration (); vide…
Nicholas Petersen
ios iframe uiwebview
It sounds like native iOS app code can receive events whenever a link is clicked within a UIWebView, but is this still true when the link is within an iframe inside the UIWebView? In our case, we just want to display a blog post (etc.) within an iframe, with that iframe wrapped by a bare HTML document. But if the user clicks any links from there, we want the app to detect that and navigate outside of our app to the external browser (perhaps with an alert w…
Synergy807
android opengl-es android-activity glsurfaceview
I've been assigned to create an open-source Java port of the Objective-C GPUImage framework so that it can be used in an Android application. I am to recreate it as closely as I can, with all the variable names, function names, etc. kept the same. I'm in the beginning stages and I'm trying to port GPUImageOpenGLESContext.h and GPUImageOpenGLESContext.m (sorry, I would provide links, but as a new user I cannot add any more links). I'm having difficulty with these methods: + (GLint)maximumTextureSize…
Waneck
iphone ios opengl-es cocos2d-iphone antialiasing
I am new to OpenGL programming, but I am doing something very basic, and the difference in quality between my custom OpenGL code and cocos2d is huge! I am trying to just load an image and continuously rotate it every frame. With my code, I get a lot of flickering and sharp edges, while cocos2d has it all nice and smooth. I've set up 4x multisample anti-aliasing using Apple's recommended code for iOS 4 in my code, and still it looks very bad in comparison to cocos2d without any MSAA. You can see the…
Karol159
java android c opengl-es android-ndk
I'm having trouble retrieving a bitmap from OpenGL ES 1.0 rendering. Graphics rendering is done in the Android NDK and I want to display the bitmap in Android Java. I know that I have to use the glReadPixels function. I've tried passing a Bitmap object to the NDK and modifying it, as shown in the bitmap-plasma sample from the NDK, but an error occurs ("GL_INVLID_EXCEPTION") and the bitmap is untouched. I have tried combinations with an ABGR bitmap, but without success. Here is my code: void Java_com_example_polygonmap_Po…
genpfault
android animation video text opengl-es
We are trying to create text animations like transitions and zooms on top of a video, but we are not able to get a smooth transition; the effect is jerky. We then understood that sub-pixel rendering is not possible in Android. What alternatives can we think of? Can we solve this problem in OpenGL? OK, I would like to add more to my question: we are actually trying to implement text animations on top of a video that is played by a native player on a SurfaceView. We tried to achieve the text animation…
Jakob
android memory opengl-es
I have a situation where I need to store OpenGL textures in memory, immediately available for further rendering. The textures are high resolution, coming from the camera. The number of textures and their sizes are directly related to the user experience as well as to the capabilities of the device. I need to load as much texture data as possible without potentially running into any memory problems later. How do I determine the maximum memory I can use for textures without affecting other part…
genpfault
ios opengl-es opengl-es-2.0
I use this code to set up my framebuffer: glGenRenderbuffers(1, &colorBuffer_); glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer_); if (!colorBuffer_) { NSLog(@"glGenRenderbuffers() failed"); break; } [self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:drawable_]; glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width_); glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height_); glGenFramebuffers(1, &fbo); glBindFramebuffer(GL_FRAMEBUF…
John
android opengl-es egl
I want to do off-screen image processing on Android in native code, so I need to create the OpenGL context in native code with EGL. With EGL we can create an EGLSurface; I can see there are three surface-type choices: EGL_WINDOW_BIT, EGL_PIXMAP_BIT, and EGL_PBUFFER_BIT. The first one is for on-screen processing and the second one is for off-screen, so I use EGL_PIXMAP_BIT like this: // Step 1 - Get the default display. EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType) 0); if ((eglDisplay = eglGetDisplay(EGL_…
Andi Droid
android opengl-es compression textures
I'm working on an Android OpenGL app with lots of textures and meshes. For textures I'm using ETC1 compression (most textures are 2048x1024 pixels). After testing my current build on a Sony Ericsson Xperia Arc I got an OutOfMemory exception. After searching for this problem the whole day, I tested on a Nexus One and hit the same exception. The newer Samsung Nexus (Prime) and Samsung Note worked without problems. I've noticed that the native heap on the Xperia Arc and Nexus One grows dramatically. I found that th…
user1121537
android opengl-es shader
Shader compilation fails if I use any native interface above my SurfaceView. If I start the application like this, everything is OK: surfaceView = new mSurfaceView(this); renderer = new mRender(); surfaceView.setRenderer(renderer); setContentView(surfaceView); But if I use any interface it fails. I'm applying the layout like this: setContentView(R.layout.main); surfaceView = (mSurfaceView) findViewById(R.id.gl_surface_view); renderer = new mRender(); surfaceView.setRenderer(renderer); The app cras…
Robert
ios image-processing gpuimage
I have tried to set up a simple GPUImage environment in a UIViewController. However, I get a blank screen. I want to animate a property of the swirl filter while the source image is kept constant. How can I do this? I want good performance (>30 fps) for largish images, so I would ideally like to avoid unnecessary pixel copies back and forth between GPU and CPU. - (void)viewDidLoad { UIImage *inputImage = [UIImage imageNamed:@"image.jpg"]; self.stillImageSource = [[GPUImagePicture alloc] initWithImage:…
zakinaeem
objective-c gpuimage avmutablecomposition
I am working on integrating a video player with the very powerful GPUImage framework (https://github.com/BradLarson/GPUImage), whereby the user can queue videos for continuous playback (AVMutableComposition) and also select a number of GPUImage filters to be applied to selected clips in the queue at runtime, say clips 2 and 4 out of 5 in total. The key is that I have unfiltered clips and cannot afford to generate and save "filtered" versions of them, so this has to happen at runtime. Is there…
Brad Larson
ios avfoundation gpuimage
GPUImage has allowed me to manipulate my video files incredibly efficiently in just the way I desire, but only if I use AVFileTypeQuickTimeMovie when I specify the fileType of the GPUImageMovieWriter. My application needs to interact with Android devices, which are incapable of playing QuickTime movie files, so I attempted to change my code to encode as follows: movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen ma…
shannoga
ios gpuimage application-state gpuimagestillcamera
I know it sounds silly, but just to clarify a point: is there any chance that viewDidLoad will be called before didBecomeActive? Is it totally impossible? EDIT: We have a crash that happens when the user comes back to the app from the background and we start to use OpenGL. The crash error indicates that we try to use OpenGL in the background. It is important to say that our app lives in the background as a VOIP app. We are trying to figure out whether there is a chance that somehow we are triggering something in…
MarkPowell
ios opengl-es vbo gpuimage
I am making use of Brad Larson's wonderful GPUImage library for image manipulation. So far, it's been great. However, I'm trying to add a filter to allow mesh deformation and running into quite a few issues. Specifically, I want a filter that uses a VBO to render the quad so that I can ultimately change the vertices dynamically for the deformation. The first step of using VBOs is causing a crash. I created a subclass of GPUImageFilter overriding - (void)newFrameReadyAtTime:(CMTime)frameTi…
genpfault
ios opengl-es gpuimage
We are decoding MP4-compressed video with an embedded mask using GPUImageMovie. The problem is that if the user presses the Home button during video decoding, the app crashes because of an OpenGL trap; iOS does not allow OpenGL use in the background. Is there any way to tell GPUImageMovie to stop decoding immediately, including the video thread?
RomanHouse
iphone objective-c ios gpuimage
After I call [_stillCamera pauseCameraCapture]; I want to save the captured photo. Code: [_stillCamera capturePhotoAsJPEGProcessedUpToFilter:_filter withCompletionHandler:^(NSData *processedJPEG, NSError *error){ [self savePhotoToAlbum:processedJPEG]; if (error) { NSLog(@"error"); } }]; The first time, the photo saves successfully. After this I do not call [_stillCamera resumeCameraCapture];, but when I tap the save button again my application crashes without any message. How can I solve this?
tiltem
ios avassetwriter gpuimage
I'm trying to get started with the really great GPUImage framework so graciously shared by Brad Larson, but I'm having an issue. When running the SimpleVideoFileFilter sample, it always crashes at completion with the following error: [AVAssetWriterInput markAsFinished] Cannot call method when status is 2. Does anyone know how to correct this? Also, I do not see the video when running in the simulator; does it not work in the simulator? Running iOS 6.1 and Xcode 4.6. Thanks! I am noticing that finishRecordingWithCompletionHandler…
Student
iphone ios objective-c gpuimage
I am working on a picture/video filter-effect project. To create the effects I am using the GPUImage project. It works fine for pictures. Now I need to apply the same effects to videos too. I grab images from the video at 30 frames per second, so for a 1-minute video I filter about 1800 images. For each image I allocate GPUImagePicture and GPUImageSepiaFilter instances and release them manually, but these allocations are not released, and after processing about 20 seconds of video the application crashes due to…
jokerdc
ios gpuimage
I started from Brad Larson's tutorial on GitHub. When I added this code to my project: - (void)viewDidLoad { [super viewDidLoad]; GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack]; videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait; GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"]; GPUImageView *filteredVideoView…