Wednesday, December 23, 2009

Adding Email in-app dialog to OpenGL App

I encountered a problem when trying to follow this tutorial about adding an in-app email dialog. My general problem was that I didn't have a UIViewController. I only had a MyEAGLView instance, a UIView.

After reading this article about UIViewController, it clicked.

All I had to do was create a UIViewController when the app finished launching, add the OpenGL view to the view controller, and then make the view controller's view a subview of the window.

viewController = [[UIViewController alloc] init];
[viewController.view addSubview:glView];

[window addSubview:viewController.view];

Tuesday, December 22, 2009

iPhone Ad Hoc distribution

I am just going to paste some links that helped me out through the process of nailing down a release that I can give to my friends for beta testing. Just need to fix some bugs before giving them the beta :)

Process of setting up ad hoc build

Tutorial to give to users to get the UDID

Tutorial to give to users for installing beta

Debugging Crash Logs

Tuesday, December 15, 2009

Xmas tree - gamedev analogy

Today I was assembling our new Christmas tree, when I started comparing the process with developing a game.

When you start assembling the tree you first put in place the base, then the pillars where you will start hooking in the branches layer by layer.
Well, even in gamedev we have to make the skeleton, some basic engine where you can start building a prototype, feature by feature.

Then come the decorations: tinsel, lights and the star at the top of the tree. Even in gamedev you have polishing, polishing, polishing, and even more... you guessed it... POLISHING! It's those small details that make a Christmas tree look great. That counts for games too (anything, really).

Best wishes to all.

Monday, November 23, 2009

Localization/Internationalization/i18n for iPhone apps

I have found this internationalization guide on how to make your iPhone app behave appropriately according to the iPhone's language preference. Very easy to get started. Ideally the strings are externalized from day 1. It's a bit time consuming to externalize them later on :(

Friday, November 20, 2009

High Score System in place

I'm still working on the iPhone game, but unfortunately only a couple of hours a week. I have created a high score system using Rails on Heroku - a very nice Rails hosting service which is free for a basic setup and can easily be ramped up in a jiffy.

The next thing I need to check is in-app purchases for downloadable content. Should be interesting...

Sunday, October 25, 2009

Updating the Provisioning Profile

When my provisioning profile expired, I had to create another one. But I had problems hooking it up with the project. This how-to saved the day.

Tuesday, September 29, 2009

Unit testing on the iPhone SDK

I have been busy preparing lectures (still am) but I'm still alive :)


I realised that I was adding code to the iPhone project more like in a prototyping stage. I wasn't writing any tests, but I still tried to organize the classes as best as I could. Now I'm starting to lose confidence whenever I'm about to add a feature. Not a good feeling. I want to switch to TDD (Test Driven Development), where you first write a failing test, and then write the production code to make it pass. The excellent side effect of this is that the design of the classes is automatically decoupled. However, I first need to get my project covered by tests. I found an interesting book called "Working Effectively with Legacy Code" which gives a lot of tips on how to break down classes, etc. I'm still reading it in my free time, but I will soon be putting the tips to work.

Setting up test harness
I searched for how to set up a test harness in Xcode. Version 3 supports unit tests using the SenTestingKit classes. I found a presentation and an Apple support page on setting up a test harness. They also describe how to set up functional tests, however I will only be using unit (logic) tests for now.

Some tips to finalize the setup

Remember to drag any .m files into the Compile Sources of the logic tests target, along with any libraries.
Also, with the Active Target set to LogicTests, go to the Build tab and, in the GCC 4.2 - Language section, turn on Precompile Prefix Header and set the Prefix Header to _Prefix.pch.
It is very important to do this, as otherwise you will get loads of errors like could not find CGPoint class, etc.

NSLog...where are they output?
What about logging when running the tests? Since executing the unit tests only requires building the target, there is no console output in Xcode. However, you can view the output of NSLog statements in the Console application. Launch Console from Spotlight.

Code Coverage
I also wanted code coverage, to know how much of the code is exercised by the tests. I found an excellent tutorial on setting up code coverage. It uses CoverStory - a tool for viewing code coverage results.

Now I need to rewire my brain to think in tests...

Friday, August 21, 2009

Creating sparks/bolts in Opengl ES

I wanted to create some sparks/lightning/bolts kind of thing on the iPhone/iPod touch using OpenGL ES. A quick google turned up this Delphi OpenGL project. I adapted it and created a quick spark object. Here is the draw method involved:

-(void) draw {
#define random ((float)random()/RAND_MAX)
    // initialise the start and end points
    yDisp[0] = yDisp[STEPS-1] = 0;

    // calculate new Y coordinate. new = old + random.
    for (int i = 1; i < STEPS-1; i++) {
        yDisp[i] += 0.05f*(random-0.5f); //step constant is a guess, tune to taste
        if (yDisp[i] > yDisp[i-1] + 0.075f) yDisp[i] = yDisp[i-1]+0.075f;
        if (yDisp[i] > yDisp[i+1] + 0.075f) yDisp[i] = yDisp[i+1]+0.075f;
        if (yDisp[i] > 0.5f) yDisp[i] = 0.5f;
        if (yDisp[i] < -0.5f) yDisp[i] = -0.5f;
    }

    // Prepare the vertices as a Triangle strip
    float rnd;
    for (int j = 0; j < STEPS; j++) {
        rnd = 0.04f*(random-0.5f); //0.04 * random between -0.5 and 0.5
        vertices[j*6 + 0] = length*j/STEPS + rnd; //x between 0 and length with some slight randomness
        vertices[j*6 + 1] = -halfThickness + (yDisp[j] + rnd) * amplitude; //y
        vertices[j*6 + 2] = 0; //z

        vertices[j*6 + 3] = length*j/STEPS + rnd; //x
        vertices[j*6 + 4] = halfThickness + (yDisp[j] + rnd) * amplitude; //y
        vertices[j*6 + 5] = 0; //z
    }

    // Draw the vertices
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    glColor4f(0.4f, 0.3f, 0.8f, 1.0f);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glRotatef(angleInDegrees, 0, 0, 1);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, STEPS*2);
}

The vertices array and yDisp array need to be malloced appropriately in the init method (and freed in dealloc):

vertices = malloc(sizeof(GLfloat) * 3 * STEPS * 2);//3 coordinates for each vertex, 2 vertices for each step
yDisp = malloc(sizeof(float) * STEPS);

Some good values for a decent spark are length 200, halfThickness 1, amplitude 50.
STEPS was #defined to 40 for now.
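The neighbour/range clamping above can be distilled into a small standalone C helper (a sketch; clampDisp is a name made up here, the constants are the ones from the loop above):

```c
#include <assert.h>
#include <math.h>

/* Clamp one displacement value against its neighbours and the global
   [-0.5, 0.5] range, mirroring the per-step rules in the draw method. */
static float clampDisp(float value, float prev, float next)
{
    if (value > prev + 0.075f) value = prev + 0.075f;
    if (value > next + 0.075f) value = next + 0.075f;
    if (value > 0.5f)  value = 0.5f;
    if (value < -0.5f) value = -0.5f;
    return value;
}
```

Keeping each step within 0.075 of its neighbours is what stops the bolt from looking like pure noise.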

This is just the beginning. I need to add more effects like glowing endpoints and subtle particle systems.

Wednesday, August 05, 2009

gluUnProject for iPhone / OpenGL ES

I wasn't surprised that gluUnProject doesn't exist in the iPhone SDK, since Apple did not provide a glu library implementation. However, as I had already migrated gluLookAt, it was time to migrate gluUnProject. It has been ages since I had used this function. The only thing I remembered was that it is used to find out where you clicked in your 3d world using a 2d device, i.e. a mouse. In our case, on the iPhone, it's the capacitive touch screen which gives us back the 2d coordinates of where we touched the screen.

Since I'm using a perspective view, I needed to translate those coordinates into world coordinates. In my case, in the current project I'm working on, I ultimately want to know which button the user clicked in a grid.

I got the gluUnProject code from MESA. However the code needed some minor adjustments, namely converting everything from double to float:
  • any GLdouble had to be replaced with GLfloat
  • any double numbers, e.g. 0.0 or 1.0 I converted them to their respective float counterpart, e.g. 0.0f or 1.0f
  • and I replaced math functions which accepted/returned doubles with their float versions, e.g. fabs -> fabsf
The next problem was that gluUnProject takes 3 coordinates; 2 of them are retrieved from the touch event, but the Z coordinate is unknown. On desktop OpenGL, the depth buffer is usually queried to retrieve the depth value (between 0 and 1) at the x-y coordinate. But we cannot do this on the iPhone as it uses tile rendering.

So I thought I'd use a value of zero, but in effect it always gave me the coordinates of the center of the screen. In reality it was giving me the coordinates of the camera position. When I tried 1 instead of 0, it gave me coordinates that made more sense, but still not 100% precise.

The solution was to unproject twice, one time at the near plane (z = 0) and one time at the far plane (z = 1) as discussed in a forum. That gives you a ray which can then be used to make an intersection with a plane and get the exact coordinates. Thus converting from 2d to 3d.

Here's the method I have used which uses the migrated gluUnProject using just floats:

-(CGPoint) getOGLPos:(CGPoint)winPos {
    // I am doing this once at the beginning when I set the perspective view
    // glGetFloatv( GL_MODELVIEW_MATRIX, __modelview );
    // glGetFloatv( GL_PROJECTION_MATRIX, __projection );
    // glGetIntegerv( GL_VIEWPORT, __viewport );

    //opengl's 0,0 is at the bottom, not at the top
    winPos.y = (float)__viewport[3] - winPos.y;
    // float winZ;
    //we cannot do the following in OpenGL ES due to tile rendering
    // glReadPixels( (int)winPos.x, (int)winPos.y, 1, 1, GL_DEPTH_COMPONENT24_OES, GL_FLOAT, &winZ );

    float cX, cY, cZ, fX, fY, fZ;
    //gives us the camera position (near plane)
    gluUnProject( winPos.x, winPos.y, 0, __modelview, __projection, __viewport, &cX, &cY, &cZ);
    //far plane
    gluUnProject( winPos.x, winPos.y, 1, __modelview, __projection, __viewport, &fX, &fY, &fZ);

    //We could use some vector3d class, but this will do fine for now
    //ray direction = far point - near point, normalized
    fX -= cX;
    fY -= cY;
    fZ -= cZ;
    float rayLength = sqrtf(fX*fX + fY*fY + fZ*fZ);
    fX /= rayLength;
    fY /= rayLength;
    fZ /= rayLength;

    //T = [planeNormal.(pointOnPlane - rayOrigin)]/planeNormal.rayDirection;
    //pointInPlane = rayOrigin + (rayDirection * T);

    float dot1, dot2;

    float pointInPlaneX = 0;
    float pointInPlaneY = 0;
    float pointInPlaneZ = 0;
    float planeNormalX = 0;
    float planeNormalY = 0;
    float planeNormalZ = -1;

    pointInPlaneX -= cX;
    pointInPlaneY -= cY;
    pointInPlaneZ -= cZ;

    dot1 = (planeNormalX * pointInPlaneX) + (planeNormalY * pointInPlaneY) + (planeNormalZ * pointInPlaneZ);
    dot2 = (planeNormalX * fX) + (planeNormalY * fY) + (planeNormalZ * fZ);

    float t = dot1/dot2;

    fX *= t;
    fY *= t;
    //we don't need the z coordinate in my case

    return CGPointMake(fX + cX, fY + cY);
}
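Stripped of the GL plumbing, the ray/plane intersection at the end of the method is just a few dot products. A standalone C sketch (rayHitZ0 is a hypothetical helper; it assumes, as above, a plane through the origin with normal (0, 0, -1)):

```c
#include <assert.h>
#include <math.h>

/* Intersect a ray (origin o, direction d) with the plane z = 0 and
   return the x/y of the hit point - the same dot1/dot2 math as above. */
static void rayHitZ0(const float o[3], const float d[3], float *hitX, float *hitY)
{
    const float n[3] = {0.0f, 0.0f, -1.0f};        /* plane normal */
    const float p[3] = {-o[0], -o[1], -o[2]};      /* pointOnPlane(origin) - rayOrigin */
    float dot1 = n[0]*p[0] + n[1]*p[1] + n[2]*p[2];
    float dot2 = n[0]*d[0] + n[1]*d[1] + n[2]*d[2];
    float t = dot1 / dot2;                         /* distance along the ray */
    *hitX = o[0] + d[0]*t;
    *hitY = o[1] + d[1]*t;
}
```

Note that t scales with the direction vector's length, so the intersection point comes out the same whether or not the direction is normalized first.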

Tuesday, August 04, 2009

Disabling Texture Units

After I managed to create multitextured polygons, I ran into another problem. After flushing the multitextured vertex array, I needed to flush a single-textured vertex array for the HUD (heads-up display, i.e. timer, score, etc). The problem looked like it was using the texture coordinates of the previous vertex array.

After some googling and trying to understand what was happening, I realized that I needed to disable the 2nd texture unit. When you are using texture coordinate array pointers you use glClientActiveTexture, so before drawing the non-multitextured vertex array, I needed to disable the second texture unit and then switch back to the first texture unit:

//select and disable the 2nd texture unit
glActiveTexture(GL_TEXTURE1);
glClientActiveTexture(GL_TEXTURE1);
glDisable(GL_TEXTURE_2D);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
//back to the 1st texture unit
glActiveTexture(GL_TEXTURE0);
glClientActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, [_texture name]);

Initially I tried using glActiveTexture instead of glClientActiveTexture. glActiveTexture selects the unit for server-side state (glEnable/glDisable, glBindTexture, glTexEnv), while glClientActiveTexture selects the unit for client-side state such as texture coordinate arrays. Since I was only touching the server side, my texture coordinate array state wasn't changing, and obviously nothing was happening.

Saturday, August 01, 2009

Multitexturing on Opengl ES

I wanted to create a random highlight animation effect on the buttons that glides over them every now and then. At first I was going to do it in a multipass approach, i.e. first draw the button polygons, then the highlight over them. However it should be more efficient (with loads of polygons anyway) if multitexturing is used. The iPhone has 2 Texture Units (TU) and so I can take advantage of that.

I had never done multitexturing in OpenGL before, so I had to learn the concept. If you understand the blending functions with the frame buffer, then multitexturing will be easy. The difference is that you can combine the results of texture units. Since we have two TUs, we can make the first texture blend with the frame buffer, and then overlay the second texture by adding it to the result of the previous TU (GL_PREVIOUS).
You dictate how the TUs combine by setting GL_COMBINE_RGB to GL_MODULATE, GL_ADD, GL_DECAL, or GL_REPLACE. One should take a look at what each one evaluates to, and also experiment a bit with them.

Here is more detailed information about texture combiners using a fixed pipeline, and what it would look like if we used shaders. I must admit that shaders are easier to read, at least such simple ones, but shaders are only supported on the iPhone 3GS.

Before setting up the multitexturing, I set up the texture coordinates for both TUs (selecting each client-side unit first):
glClientActiveTexture(GL_TEXTURE0);
glTexCoordPointer(2, GL_FLOAT, sizeof(VertexDataMultiTextured), &vd[0].uv0);
glClientActiveTexture(GL_TEXTURE1);
glTexCoordPointer(2, GL_FLOAT, sizeof(VertexDataMultiTextured), &vd[0].uv1);

Then I set up how the TUs should behave. In my case I had a texture for the button (in an atlas) and used the following OpenGL commands:
glBindTexture(GL_TEXTURE_2D, [_texture name]);
//blend the texture with the framebuffer(GL_PREVIOUS)
//use the texture's alpha channel

And then I selected the second texture unit and added the color information of the glow texture (which was in the same atlas) to the result of the previous TU:
glBindTexture(GL_TEXTURE_2D, [_texture name]);
//add the previous color information with the texture's color information
//don't effect the alpha channel, use the result (GL_PREVIOUS) of the previous texture unit
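The actual glTexEnvi calls didn't survive in the snippets above, so here is a sketch of what the setup might look like in fixed-function GL. Treat the exact combiner sources as an assumption; they follow the comments above (modulate on the first unit, additive RGB on the second, alpha carried through):

```c
/* Texture unit 0: default GL_MODULATE - texture colour * vertex colour,
   texture alpha * vertex alpha; the framebuffer blend then uses that alpha. */
glActiveTexture(GL_TEXTURE0);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

/* Texture unit 1: add the glow texture's RGB to the previous unit's result,
   but keep the previous unit's alpha untouched. */
glActiveTexture(GL_TEXTURE1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_ADD);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PREVIOUS);
```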

Friday, July 31, 2009

Always initialize your variables... never assume default value

I had a problem which was going to drive me insane. The app worked fine on the simulator, but not on the device. I introduced depth in the prototype I'm working on, where buttons need to rotate and look 3d-ish. So when I added the z value, initially it was going to be zero. I assumed that a newly created variable would be zero by default. I was wrong. It is only zero by default on the simulator. On the device it won't be, at least for floats. The result was a psychedelic trippy 3d effect, which was ugly to say the least.

Now I know why Java pesters the programmer to not allow him to use variables which have not been initialized.

Thursday, July 30, 2009

Copying files to boot camp windows drive

I discovered that by default I can only view the files available on the Windows Boot Camp drive when I'm in Mac OS X. I needed to put some files on my Windows drive before booting into Windows 7 for a while. After a quick google I discovered an article which mentions NTFS-3G, which will allow you to write to any NTFS drive. Apparently it uses MacFUSE underneath.

Tuesday, July 28, 2009

LLVM/CLang static analyzer minor problem with iPhone SDK3

The solution is to just tell scan-build to use gcc 4.2.
The command would then be something like this
scan-build --use-cc=/usr/bin/gcc-4.2 -k -V xcodebuild

Audacity audio caf file problem with iPhone SDK

Today I came across a problem which wasted an hour of my life! I wanted to add new sound files to my prototype, so I downloaded Audacity to convert some sound files. I exported them as uncompressed CAF files. I had read somewhere that the iPhone supports some specific types of encoding, two of which are U-law and A-law. I tried both from within Audacity but they wouldn't play on the simulator/device. But all was good when I saved them as WAV files. Not sure why the CAF files didn't work. Someday I'll have to dig into this problem a bit deeper. For now, I'm happy with the placeholder sounds :)

Bought SVN service subscription

I required another repository as I'm doing a quick prototype of a very simple game I had once done with a friend of mine. I have decided to buy a subscription for the online SVN service. It has worked pretty well so far, and when you upgrade they provide a backup service as well, besides no limitations (disk space increased to 2GB) and multiple repositories. $40 is a good price for a hassle-free SVN service which works without any problems in Xcode.

Monday, July 27, 2009

gluLookAt for iPhone

On the iPhone SDK we don't have the glu utility library, so there are no glu functions. gluLookAt is a very helpful function for conceptualizing a camera. I needed the same behavior since I needed to add some depth. So no more glOrthof, but glFrustumf alone did not cut it.

After googling a bit I found some gluLookAt implementation which is working wonders.

Saturday, July 25, 2009

No problems with Boot Camp and Windows 7

The hard disk partitioning and installation of Windows 7 through Boot Camp was smooth as silk. No problems whatsoever. Having said that, my main OS of choice will be Mac OS X. Windows 7 will just be there mainly for some unforeseen peculiarities with some Windows files, and minor gaming. Minor because I only have the Intel GMA 950 in my macbook, so I'm lucky if I can play Quake 3 and Halo.

Friday, July 24, 2009

Preparing to install Windows 7 RC on macbook

Before I try installing Windows 7, I decided to make a backup of my macbook using Time Machine. I tried using iTimeMachine to back up over the network. This small app simply enables TM to use a network shared folder on my Windows machine. After using this small utility I found out that you could easily do the same with a command in the Terminal, as shown in the link below.

I came across 2 problems when trying to use my Windows machine as a sort of Time Capsule. The first was that I needed to create a sparse bundle myself using Disk Utility and copy it over to the remote folder. That wasn't difficult to do, as highlighted in the above link.

However, the second problem wasted a lot of my time. When TM was starting to do the initial backup, it would stop, complaining about a problem with the network username or password. This didn't make sense at all, as I could browse the folder with Finder using the same exact password. Apparently the solution was to reboot my mac and remove all keys to my Windows machine (not just the Time Machine System key). Not sure whether the reboot helped or not. Right now I'm waiting for TM to finish so I can start installing Windows 7 RC.

Keyframe animation

I had already done some keyframe animation code in Java for Swing 'n Strike. I decided to migrate some of the code. The keyframe animation code consisted mainly of two classes, a Keyframe class and a KeyframesCollection class. Each instance of a Keyframe contains information about the position, transparency, frame number/time of the keyframe, and what kind of interpolation to use with the next keyframe (linear, easing in/out, etc). The interpolation method I had in Java used to give back an instance of a Keyframe containing the interpolated values. But with Objective-C (actually, because of C), I can now keep a reference to a primitive value, e.g. any int, any unsigned char, any float, or any point... and the interpolate method directly manipulates that reference. Awesome!

It's quite powerful and easy to use. Just create a keyframes collection, give it an address to what it will be modifying, give it some keyframes, and in the update method, play the keyframe animation.
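A rough C sketch of the pointer-based interpolation idea (the Keyframe struct and interpolate function here are hypothetical, reduced to a single float value with linear interpolation):

```c
#include <assert.h>
#include <math.h>

/* A minimal keyframe: a time and the value the animated variable
   should have at that time. */
typedef struct {
    float time;
    float value;
} Keyframe;

/* Linearly interpolate between two keyframes at time t and write the
   result straight through the pointer to the animated variable. */
static void interpolate(const Keyframe *a, const Keyframe *b, float t, float *target)
{
    float f = (t - a->time) / (b->time - a->time);
    *target = a->value + (b->value - a->value) * f;
}
```

The nice part is that the caller just hands over &someSprite.x (or alpha, etc.) once, and the animation writes into it every update.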

Wednesday, July 22, 2009

Faking transparency without alpha channel

I brushed up on the GL blending functions (glBlendFunc) and discovered that you can fake transparent textures without an alpha channel. You can have a simple RGB texture with a black background (like a lens flare, for example) and then set the blending function to GL_ONE, GL_ONE (some also make the source factor GL_SRC_ALPHA, but since there is no alpha, might as well make it GL_ONE). So the source (pixel which is going to be output) factor and destination (pixel in the existing color buffer) factor are both 1. So if the source (texture) pixel is black (0, 0, 0), and the destination (frame buffer) pixel is red (1, 0, 0), the result would be red (1, 0, 0), i.e. nothing changes in the destination (frame buffer)... more formally the result is Sf * Sp + Df * Dp, where:
S = source
D = destination
p = pixel tuple
f = factor

Using our example, Sf and Df are 1 and we have black (0, 0, 0) and red (1, 0, 0)
1 * (0, 0, 0) + 1 * (1, 0, 0) = (0, 0, 0) + (1, 0, 0) = (1, 0, 0)
Using another example
1 * (0.2, 0, 0) + 1 * (0.4, 0, 0) = (0.6, 0, 0)

So it's just adding them together. If it's completely black, it's as if it is invisible.

Of course for accurate transparencies we need the alpha channel and use GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA so that they are blended together properly. If the alpha of the texture pixel is 0.2f then it would be
0.2 * (0.2, 0, 0) + (1-0.2) * (0.4, 0, 0) = (0.04, 0, 0) + ( 0.32, 0, 0) = (0.36, 0, 0)
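The per-channel arithmetic above is easy to check in a couple of lines of C (blendChannel is just the Sf * Sp + Df * Dp formula applied to one colour channel):

```c
#include <assert.h>
#include <math.h>

/* result = Sf * Sp + Df * Dp, applied per colour channel */
static float blendChannel(float srcFactor, float src, float dstFactor, float dst)
{
    return srcFactor * src + dstFactor * dst;
}
```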

glTexEnvi and GL_REPLACE

So I wanted something really simple: a textured object (with an alpha channel) that I wanted to fade out, and perhaps color at run time (instead of creating different colored textures).

I knew that I should just give the object texture coordinates and color information (besides vertex positions). Simple, but it wasn't working. It was as if the color info wasn't being taken into consideration at all. So I played around with a demo which I use as my playground, and the same code worked. I was going nuts; it was obvious I was missing something. I looked at the gl commands in the demo, and I had done all of those - enabling blending, enabling texturing, enabling vertex/texture/color arrays, the blending function, etc. Still nothing. It then clicked that it must be working automagically in the demo by default, and that I must have some line of code changing the default behaviour. I spotted the line of code that was setting the texture mode in glTexEnvi to GL_REPLACE, which basically discards the color information. I had completely forgotten what that line was doing. Ah well...

Sunday, July 12, 2009

Objective-C introduction

I have also found these quick tips for Objective-C at Cocoa Dev Central. Wish I had found them earlier :)

Selection List in UIKit

I also dared to use a selection list using the UITableViewController class on top of the OpenGL view. I used a modified version of the generic class by Jeff LaMarche. This video tutorial also helped to get an idea of how the code is organised for UITableViews.

So now I've got a pretty decent on-the-fly level editor for the game. I can also select powerups for enemy groups, draw the enemies' paths, etc. It still needs polishing, but it's coming together. This week I will try to do a decent level, but for now with just one enemy type and no boss.

Friday, July 03, 2009

Texture atlas with Photoshop slicing

I needed to create a texture atlas, and I wanted a tool to read off the coordinates and manage it easily. I tried looking at MapWin but it looked too complicated. So I opted for using Photoshop and just slicing the images. Not to create separate images - I just use slicing to easily get the coordinates and dimensions of each slice. If you double click on a slice with the slice select tool, you get that information in the dialog box that comes up. That's all I needed. It would be a different story if I had hundreds of images, but for now it's manageable. Maybe because it is still a prototype :)

Tuesday, June 30, 2009

Saving and Loading Level Data

I was tempted to use a plist to save and load data, but I want the data to be as disk-space efficient as possible, so I will be doing a binary saving and loading of the data of the elements that will be dropped in a level while using the in-game editor.

NSMutableData/CFMutableDataRef did the trick for saving the data, by using CFDataAppendBytes(). Then I created the file using NSFileManager and simply dumped the contents into the created file.

I used NSData for loading the data. I kept track of the offset and passed the pointer of the offset to the objects to load themselves up, and modified that offset which is being passed as a pointer. For loading I simply used getBytes:range, e.g. [data getBytes:&type range:NSMakeRange((*offset)++, 1)];
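In plain C terms, the offset-pointer loading pattern looks like this (readByte is a made-up stand-in for the getBytes:range: call):

```c
#include <assert.h>
#include <string.h>

/* Read one byte from the buffer at *offset and advance the offset -
   the same pattern as [data getBytes:&type range:NSMakeRange((*offset)++, 1)]. */
static unsigned char readByte(const unsigned char *buf, size_t *offset)
{
    unsigned char value;
    memcpy(&value, buf + *offset, 1);
    (*offset)++;
    return value;
}
```

Each object reads its own fields this way and leaves the offset pointing at the next object's data, so the loader never needs to know the layout of the whole file.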

Mixing UIKit and Opengl

For my in-game editor I wanted to create buttons and some textfields, however I did not want to re-invent the wheel and also did not want to complicate switching between views since the UI for the editor will not be complicated at all.

I know that UIKit elements should not be overlaid on OpenGL because of a performance hit, but it doesn't seem to affect it that much so far. It's pretty simple to create a UIButton and then just add it to the window: [window addSubview:b];

I created a helper method to quickly create consistent looking buttons and I'm using some iPhone images I extracted from the sdk. I'm quite happy with the cleanliness of UIKit, I must say.

Monday, June 29, 2009

Using iPhone OS images

I've come across this code snippet which extracts the images used by the iPhone OS. You just need to copy-paste the code somewhere, e.g. in your delegate, create the method (selector) signatures in the header file, and execute it when the app finishes loading by calling [self extractArtwork:self];

It takes a couple of seconds to finish, and I would run it from the simulator. You will find all the PNGs in /Users/[your username]/Library/Application Support/iPhone Simulator/User/Applications/[app guid]/Documents/

You can then go through them and see which iPhone OS images you want to use for your app. I suggest you remove the code after doing it once.

Sunday, June 28, 2009

Multitouch start and end events

I just found out that you don't always get the multitouch start and end events, i.e. you can get a multitouch move without a multitouch start. Or you could lift off your fingers but not get a multitouch end event. You have to lift your fingers off together precisely to get a multitouch end event. So it's difficult to know whether the user has two fingers or just one on the screen, because he could have two fingers on the screen but only be moving one of them, so you would get a single touch event.

This is very annoying. I want to do an in-game quick editor where you can pinch to zoom in/out with two fingers and drag with one finger. However I haven't figured out how to do it properly because of the issue described above. So far I have set a flag to true whenever a multitouch start/move is detected and turned the flag off on the end event. So when this flag is on, I cannot do dragging. However, the user would need to do a proper lift off of his two fingers to trigger a multitouch end event. Will leave it as it is for now, unless I get another idea how to solve the issue.
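A plain C sketch of the flag workaround described above (the handler names are hypothetical reductions of the touchesBegan/Moved/Ended callbacks):

```c
#include <assert.h>
#include <stdbool.h>

static bool multitouchActive = false;

/* Call with the number of touches reported by a began/moved event. */
static void touchBeganOrMoved(int touchCount)
{
    if (touchCount >= 2) multitouchActive = true; /* pinch mode: block dragging */
}

/* Call with the number of touches reported by an ended event. */
static void touchEnded(int touchCount)
{
    if (touchCount >= 2) multitouchActive = false; /* only a proper two-finger lift-off clears it */
}

/* Dragging is only allowed while we are not (possibly) pinching. */
static bool canDrag(void) { return !multitouchActive; }
```

This captures the limitation too: a one-finger end event never clears the flag, which is exactly why the user must lift both fingers together.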

Friday, June 26, 2009

Instruments and mach_msg_trap

I think I'm starting to get the hang of Instruments, sort of. Ok so in my previous post I didn't realize what that mach_msg_trap was about. Apparently, it's got something to do with the thread doing nothing (blocked), like waiting for the next frame to start (good) or when allocing and releasing a lot of objects (bad).

Will see if the above still holds after I get more experience playing around with Instruments.

Wednesday, June 24, 2009

Optimizing OpenGL ES for iPhone OS

Ok so my shmup prototype is firing a lot of bullets now and I'm seeing that it is slowing down, so I tried sampling the app with Instruments, but honestly I couldn't make heads or tails of what was happening. The CPU is being used mostly by mach_msg_trap. I tried drilling down but it seemed unrelated to my code. I still have to learn how to read the results of Instruments. It doesn't make any sense to me that my code is using like 10 samples, while other unrelated libraries are taking 2000+ samples.

So anyways, after seeing ngmoco's presentation on how to optimize opengl (also refer to apple's documentation), I thought that I should start optimizing my opengl calls. So I created a wrapper class which batches vertices bound with a texture so I can send all the bullets at one go rather than one by one. But that still didn't improve the performance.

It turned out that while I was coding the collision detection for bullets, I had left a TODO some weeks ago saying that I should optimize it :). Basically I was creating and releasing several Vector objects for every collision check. That was killing the CPU, but strangely enough the Bullet's update method only had 10 samples marked. Again... I need to learn how to read the results from Instruments. Creating some temporary vectors just once and reusing them improved the performance.

Now I have to go and try to understand the results given by Instruments... otherwise such problems will come back to haunt me pretty soon. The worst thing would be to start optimizing code randomly.

Sunday, June 21, 2009

Migrating to iPhone OS 3.0

Ok, so I found some time finally to go through the emails that Apple had sent me regarding the OS 3.0 GM seed, but when I tried accessing the link to see the checklist for migrating an app, it kept on redirecting me to the login page...arghhhhhh. This is also happening if I try to access the forums. Will contact Apple support.

Meanwhile I'm downloading the huge OS 3.0 SDK as well (since I'm still on Leopard... the Snow Leopard version is approx 1.5GB smaller). I don't have a clue where to get the GM seed from.

Interesting Blog and iPhone Development Videos

I have found out that Stanford University has some iPhone development related videos in the iTunes U section. They also had a video on OpenGL optimization specifically for the iPhone, delivered by a developer at ngmoco... the creators of several hits on the iPhone, including the finger maze game and the Topple games. Through this video, I learned about ngmoco's blog. Very interesting.

So I should find some time to implement the mentioned optimizations including
- batching the vertices into one huge array, including texture coordinates and color info, rather than sending each object individually
- use texture compression
- use one huge texture
- keeping in mind that I should use less than 24megs texture memory

Accelerometer calibration

Several weeks have passed and I'm really busy with my day job, so things are going really slowly on the prototype. I've got some basic weapon pickup code, and weapon placement similar to Xenon 2's mechanism. Will see how it feels later on. I have also migrated some bezier path code I had done for Swing 'n Strike, where I can follow a bezier path at constant speed.

The spaceship is controlled with the accelerometer. I would like to keep fingers off the screen as much as possible, since the game will be quite hectic and a finger on the screen would obstruct the user's view.

However, I would like to calibrate the iPhone's position, so to speak, i.e. record the offset when the game is launched and assume that position to be the neutral center position of the spaceship. It turned out to be very easy to do, once you know how the accelerometer works. I was puzzled at first and thought that the accelerometer would only give me values when I moved the iPhone around. It is interesting to know that it will always give you a reading for each axis, roughly between -1 and 1 when it's not moving. Why? Well, gravity (which is acceleration... approx 9.8m/sec^2) is acting on the device. So that 1g acting on it will be represented as a vector acting on the accelerometers. You just record it in some offsets and then subtract that during the game. I'm guessing that astronauts in space are screwed, since there will be no gravity acting on the device :)

So in your accelerometer:didAccelerate: method you need to record the starting offsets if you haven't already done so, and then subtract them from the actual readings later on during the game. I also discovered that the very first few readings of the accelerometer are bogus, so I skip the first few frames. (Averaging a bunch of frames would probably be better.)

- (void)accelerometer:(UIAccelerometer*)accelerometer didAccelerate:(UIAcceleration*)acceleration {
    static int framesLeftToSkip = 30; //number of initial (bogus) frames to skip before recording the offset
    static BOOL recordOffset = YES; //whether we still need to record the offset

    //Let some time pass so we get valid acceleration values before recording the offset
    if (framesLeftToSkip > 0) {
        framesLeftToSkip--;
        return;
    }

    if (recordOffset) {
        NSLog(@"setting start accelerometer %f,%f,%f", acceleration.x, acceleration.y, acceleration.z);
        _startAccelerometer[0] = acceleration.x;
        _startAccelerometer[1] = acceleration.y;
        _startAccelerometer[2] = acceleration.z;
        recordOffset = NO;
    } else {
        //Use a basic low-pass filter to keep only the gravity component in the accelerometer values
        _accelerometer[0] = (acceleration.x - _startAccelerometer[0]) * kFilteringFactor + _accelerometer[0] * (1.0 - kFilteringFactor);
        _accelerometer[1] = (acceleration.y - _startAccelerometer[1]) * kFilteringFactor + _accelerometer[1] * (1.0 - kFilteringFactor);
        _accelerometer[2] = (acceleration.z - _startAccelerometer[2]) * kFilteringFactor + _accelerometer[2] * (1.0 - kFilteringFactor);
    }
}

Monday, June 01, 2009

Started a shoot'em up prototype

Last week I decided that the puzzle prototype I was doing was boring to play, so I stopped it and started on a shoot'em up prototype. It will be a 2.5D game, but currently the view is just 2D. I will see how I can change the view soon so enemies appear to fly in in 3D.

I've got some interesting gameplay ideas to take advantage of the touch interface. The major problem I'm seeing for the development of this game is content. A shoot'em up needs to look nice. For now I'm using squares, spheres and some ugly graphics. This simply won't cut it. When the gameplay feels right, I will need someone to help me with the graphics.

Well, at least the quick starfield I've done looks semi-decent :). It's simply three layers of stars, each deeper layer moving slower, and the stars simply wrap around the screen.
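The whole effect is just this (a minimal C sketch; layer count, speeds and screen size are made-up values of mine):

```c
/* Three-layer parallax starfield: each layer scrolls at its own speed
   and stars wrap around when they leave the screen. */
#define STARS_PER_LAYER 32
#define LAYERS 3
#define SCREEN_H 480.0f

typedef struct { float x, y; } Star;

static Star stars[LAYERS][STARS_PER_LAYER];

/* deeper layers (lower index) move slower, giving the depth illusion */
static const float layer_speed[LAYERS] = { 30.0f, 60.0f, 120.0f };

static void starfield_update(float dt) {
    for (int l = 0; l < LAYERS; l++)
        for (int i = 0; i < STARS_PER_LAYER; i++) {
            stars[l][i].y += layer_speed[l] * dt;
            if (stars[l][i].y >= SCREEN_H)
                stars[l][i].y -= SCREEN_H; /* wrap around the screen */
        }
}
```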

Saturday, May 23, 2009

random() on the iphone SDK giving same values

I had obviously changed the random seed with srand(time(NULL)), however when generating 4 random numbers they were always the same, although different with each run. I thought I was going insane, as when I tried debugging it step by step the numbers generated were different.

After googling a bit I found that it is better to use arc4random(), which needs no seeding at all. That fixed it.

Breakpoints stopped working in XCode

Suddenly, the breakpoints stopped working. Usually I just clean all targets and empty XCode's cache, but this time that trick didn't work. Then I found this solution: disable Load Symbols Lazily in XCode Preferences->Debugging. That worked like a charm.

Wednesday, May 20, 2009

Got my Apple Developer Program Activation Code

Yippeee! Finally after 3 weeks!

This morning I sent Apple an email telling them that this wait for the activation code was aggravating. I told them that I bought a MacBook, switched from Windows, bought an iPod touch, but was still waiting for the activation code. Later this afternoon I received the activation code.

Setting up the device was quite straightforward. They provide a very helpful wizard-like experience. But then I got a CodeSign error when I tried Build&Go. I discovered that the Code Signing properties in the project were not set. I just had to select the developer provisioning certificate and presto... I was debugging my buggy prototype on my iPod touch :D

Monday, May 18, 2009

Got my cracked iPod touch

Have been busy preparing for a web development course I'm giving right now.
I've also been busy with making my time logging web app compatible with iphone. I will soon publish it on apple's webapp listings. 

I've got my iPod touch... ok it's cracked, but I will replace the glass later.
It's a pretty cool device, very addictive actually.

I'm still trying to get the fax through the apple developer program...

Saturday, May 09, 2009

iPhone-izing a webapp

I need a break from Objective-C :) after fixing some glitches in the collision detection. I'm still waiting for my iPod-touch and yesterday I resent the fax to enroll to the apple developer program. Hope this time they received it fine.

So I'm currently tempted to convert my mini time management webapp to be iPhone-friendly. My eclipse/RadRails setup is still on my Windows machine, so I needed a quick way to test the changes on a simulated iPhone. I found this quick iPhone simulator for Windows that uses Safari for Windows 3.2 underneath (WebKit). It's not perfect, but it does the job for now.

I need the CSS styles and images used to mimic the native iPhone look and feel. And in fact there is this iui open source project which gives you the necessary CSS and samples to get you started quickly.

Will see how it goes... the samples provided seem to be exactly what I need!

Wednesday, May 06, 2009

Windows 7 RC on my Macbook

I downloaded the Windows 7 RC, which should be fully working until March 2010, after which they said it will start to shut down every 2 hours.

I found a how-to run Windows 7 RC on a Mac using VMWare but first I downloaded the Windows 7 RC from Microsoft. Just need a Live account.

Ok, why VMWare and not bootcamp? I don't want to depend on Windows. I just want it to be able to run something quickly on Windows without having to reboot. I only plan to do this until I phase out Windows from my life, if I ever manage. I don't want to be switching between computers/OSes. I'm already frying my brain with switching between shortcuts and swapped positions of the minimize/maximize/close buttons.

The installation was a breeze, although it took some time to finish, maybe because I only have 1 gig on my MacBook. It's a pain running Win7 with just 512 megs reserved for it, when the recommended amount is 1 gig! My MacBook came to a crawl, haha. First time I saw my MacBook on its knees. I will be upgrading its RAM this summer, so I will retry running it then. For now I suggest that if you have just 1 gig in your MacBook, give up and don't waste your time.

Final note: I have an original Windows Vista, and I never got myself to install it. It's crap. I worked with it for a year at my old job, 8+ hours a day, and it was a nightmare. I think Vista should have remained a beta. But Windows 7 looks promising. Personally I prefer the new Windows taskbar to the Mac's dock.

Tuesday, May 05, 2009

Fast flood fill implementation

I'm keeping an array of bytes (unsigned char) as a collision map. I could probably improve it later on to use a bit per pixel, but for now I'm going to keep it simple.

At first I was going to create a gray scale CGBitmapContext to store the data, but I wasn't going to actually draw it. Just wanted a collision map to be used underneath so I opted for a simple array.

Of course this meant that I needed some Bresenham line algorithm implementation and also a quick flood fill for the game prototype I'm doing. For the flood fill I first did a naive recursive version, but that can easily give you a stack overflow. So I did some searches for faster implementations and found a couple. The QuickFill algorithm seemed overkill to convert to Objective-C (and looked complicated as well). Remember I'm still prototyping, so it's not the right time for a lot of optimizations. I settled for the implementation found at codecodex, which seems to be a compromise between the recursive and iterative approaches. Seems to do the job for now. Not the fastest, but better than the pure recursive version.
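For reference, the explicit-stack idea behind such iterative fills looks roughly like this in C (my own minimal sketch, not the codecodex code; the byte-per-pixel map layout and names are mine):

```c
#include <stdlib.h>

/* Iterative 4-way flood fill over a byte-per-pixel collision map.
   A heap-allocated stack of coordinates avoids the call-stack
   overflow that the naive recursive version can cause. */
#define W 8
#define H 8

static void flood_fill(unsigned char *map, int x, int y,
                       unsigned char from, unsigned char to) {
    if (from == to) return;
    /* worst case: 1 initial push + 4 pushes per painted cell */
    int *stack = malloc(sizeof(int) * 2 * (4 * W * H + 1));
    int top = 0;
    stack[top++] = x; stack[top++] = y;
    while (top > 0) {
        y = stack[--top];
        x = stack[--top];
        if (x < 0 || x >= W || y < 0 || y >= H) continue;
        if (map[y * W + x] != from) continue;
        map[y * W + x] = to;
        /* push the 4 neighbours */
        stack[top++] = x + 1; stack[top++] = y;
        stack[top++] = x - 1; stack[top++] = y;
        stack[top++] = x;     stack[top++] = y + 1;
        stack[top++] = x;     stack[top++] = y - 1;
    }
    free(stack);
}
```

Scanline variants (like QuickFill) push whole horizontal runs instead of single pixels, which is why they are faster but also fiddlier to port.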

Need to fix some collision detection problems next, and also double check the line algorithm, since I think it's not behaving right at the edges right now.

Monday, May 04, 2009

Mac - I'm Lovin' it

It's been slightly over a month with my second hand MacBook. I'm simply lovin' it. I'm still migrating from windows. I'm only using my windows machine through remote desktop. In fact I unplugged the monitor, keyboard and mouse and hooked them to the laptop :)

Still getting used to XCode, Objective-C and OpenGl but it's fun.

The MacBook came pre-installed with MS Office 2008, but I wasn't impressed with it. I must admit that MS Office is better on the Windows platform, more responsive.

The other complaint I have is that a minimized window won't get restored when you switch back to its application. It makes sense when you consider that Macs are application oriented and not window oriented, but it's still frustrating having to grab the mouse, go over the dock and restore it. The workaround is using Command-H to hide the window instead.

But besides these minor details, I still think that a mac is better than windows. We'll see how I'll feel in a couple of months down the line.

Saturday, May 02, 2009

Be careful of constructor helper methods

I forgot that constructor helper methods (e.g. +arrayWithCapacity) return an autoreleased array, leading to an EXC_BAD_ACCESS. Still getting used to the reference counting mechanism and conventions :).

I was using such a method for an instance variable. The solution was either to retain it, or to use alloc/init, which gives you an object with a retain count of 1.

_replayLines = [[NSMutableArray alloc] initWithCapacity:5];

iPod Touch soon on its way

I have bought a (supposedly) not-badly-cracked iPod touch 8 gig 2nd generation on eBay - want to keep expenses as low as possible. However, since the guy had several of them, he shipped the wrong one. Its screen is badly cracked. Will have to replace the glass later.

Can't wait to be able to test on the device directly! Just have to wait for the iPod touch to arrive in Malta from the US, since I had to ship it by proxy to a friend of mine who lives there. Here's a photo he took for me to show me what he called "almost artistic" cracks.

Making reproducible bugs

I've got some bugs lurking around related to collision detection. The first rule when debugging is to make the bugs reproducible. So I'm currently recording the user interactions and the frame numbers at which they occurred into an NSMutableArray, then dumping it to NSUserDefaults when exiting the application (should be enough for now). I will load them back up when the application is relaunched, and at the frame numbers at which the user interactions (basically lines) were originally made, I recreate the lines. I also had to remove any randomness if I want each playback of the recorded actions to produce the same outcomes (i.e. bugs). So the ball(s) will always start from the same position.
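The record/replay mechanism boils down to something like this (a simplified C stand-in for the NSMutableArray version; the struct, the fixed-size buffer and all names are my own):

```c
/* Record user interactions (lines) together with the frame number at
   which they occurred, so they can be replayed deterministically. */
typedef struct { int frame; float x0, y0, x1, y1; } LineEvent;

#define MAX_EVENTS 256
static LineEvent events[MAX_EVENTS];
static int event_count = 0;

/* called while playing normally */
static void record_line(int frame, float x0, float y0, float x1, float y1) {
    if (event_count < MAX_EVENTS) {
        LineEvent e = { frame, x0, y0, x1, y1 };
        events[event_count++] = e;
    }
}

/* called once per frame during replay; copies any events recorded at
   this frame number into `out` and returns how many fired */
static int replay_frame(int frame, LineEvent *out, int max_out) {
    int fired = 0;
    for (int i = 0; i < event_count && fired < max_out; i++)
        if (events[i].frame == frame)
            out[fired++] = events[i];
    return fired;
}
```

For the replay to reproduce the bug exactly, everything else must be deterministic too, hence removing the randomness from the ball's starting position.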

Will try to finish this interactions playback later on. Have to prepare for next week's lectures.

Tuesday, April 28, 2009

Interesting iPhone Development Presentation

Snappy Touch is a company owned by Noel Llopis, who previously worked on massive console games and is now working on his own start-up: just a one-person "team" rather than two hundred. He has gone solo and started making iPhone games. He gave an interesting presentation at GDC 09 where he shares some knowledge about the process of getting aboard this interesting journey.

Check out the GDC 09 presentation, including audio. You can sync the slides manually with the audio pretty easily.

Friday, April 24, 2009

The pains of converting from Java to Objective-C

I'm still compiling a list of the annoyances I encounter when converting from Java to Objective-C, which I will publish soon. But so far the worst one was having to be careful with math functions like abs instead of fabs! The abs function works with ints, and if you try using it with a float you are going to get a bogus result. Why oh why aren't such functions overloaded!?

Wasted a good two hours hunting down such a bug in code which worked fine in ActionScript.

Mental note: remember as well sinf, cosf, asinf, acosf
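A tiny C illustration of the trap (the helper names are mine; the cast just makes explicit what happens silently when you hand abs a float):

```c
#include <math.h>
#include <stdlib.h>

/* abs() is declared int abs(int): a float argument gets converted to
   an int, so the fractional part is thrown away before the absolute
   value is even computed. */
static float broken_fabs(float v) {
    int truncated = (int)v;   /* this conversion happens implicitly
                                 when you call abs() on a float */
    return (float)abs(truncated);
}

static float working_fabs(float v) {
    return fabsf(v);          /* the correct function for floats */
}
```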

Migrating current actionscript code to Xcode... check :D
Next task...

Thursday, April 23, 2009

Refactoring in XCode needs improvement

I'm used to Eclipse refactoring, and the refactoring capabilities in XCode compared to Eclipse are abysmal, to say the least. OK, I know Java is much easier to parse, so the refactoring tools are "easier" to build, but I would expect that at least renaming a field of an Objective-C class would be done across both the header and implementation files, and not just the header file. Instead I have to use the rename dialog box, which of course is very error prone.

Maybe I'm doing something wrong, but I don't see what that could be. I'm just selecting the field, Edit->Refactor->Rename.

Friday, April 17, 2009

Killing bugs before they happen on the iPhone

You can easily set up a static analyzer, which will check your Objective-C code for bugs like memory leaks, dead assignments, etc., without actually running the app on the iPhone (that's why it's "static"). Check out this tutorial to set up the LLVM/Clang Static Analyzer for iPhone apps.

You need to dirty your hands slightly with the Terminal to run the analyzer. You might want to add the analyzer to your PATH in your .bash_profile so that you can run it from anywhere.
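For example, the setup might look like this (the install path and the xcodebuild flags are placeholders for whatever your tutorial/project uses; scan-build is the analyzer's build-wrapping driver):

```shell
# Example ~/.bash_profile addition, assuming the analyzer was unpacked
# into ~/checker (adjust to wherever you installed it):
export PATH="$HOME/checker:$PATH"

# Then, from the project directory, wrap the build with the analyzer
# driver so it can inspect every file as it compiles:
scan-build xcodebuild -configuration Debug -sdk iphonesimulator
```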

When you run the analyzer, you will see a bug report with links to your source files to show you the exact line where you have a bug. Sweet!

Drawing a Textured Line in OpenGL ES

Currently I'm still playing around with the CrashLanding demo app as my sandbox :) Those red spotted lines are my textured lines.

I wanted to draw a textured line, and OpenGL doesn't support drawing textured lines as such. You need to create a polygon (actually two triangles) and texture it.

So I created a simple class which takes two points and calculates the length and the angle with the horizontal. Then I calculate the vertex coordinates and the texture coordinates.

These coordinates are generated as if I am creating a horizontal line. And then before drawing I simply rotate the model view matrix. The vertex coordinates would be something like this
0, -w/2
0, w/2
length, w/2
length, -w/2
where length is the length of the line and w is the width of the line

The texture coordinates would be calculated as follows:
0, 0
0, 1
length / tw, 1
length / tw, 0
where tw is the texture width.

Then draw the two triangles to create the textured line.

glPushMatrix(); //preserve the modelview matrix across draws
glBindTexture(GL_TEXTURE_2D, [_texture name]);
glTranslatef(_point.x, _point.y, 0);
glRotatef(_angle, 0, 0, 1);

glVertexPointer(2, GL_FLOAT, 0, _vertexArray);
glTexCoordPointer(2, GL_FLOAT, 0, _textCoordArray);

glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
glPopMatrix();

Tuesday, April 14, 2009

Adding an SCM SVN Repository to XCode

I needed to set up source control. Having Subversion (SVN), or any other source control management system, is very important: besides backing up the code remotely, I can quickly diff against previous releases, commit with comments, tag files, create branches, etc.

I googled a bit for free SVN hosting and found XP-Dev. After creating my repository I set up XCode to know about my XP-Dev repository. That was a breeze, but then a problem cropped up when I wanted to import my existing prototype into the repository. It was complaining about some SCM Error 155007.

Googled a bit more, and luckily some other guys had the same problem. They suggested doing the initial commit from the terminal. This did the trick:

svn import /path/to/project https://your-repo-url/project -m "Initial import"

Remember to specify the repository URL, including the project name, when importing, otherwise you won't be able to do a proper checkout. After the import is done, you need to go to XCode>SCM>Repositories and do a checkout to a different directory. It should prompt you to automatically open the XCode project when it finishes.

To check if it's working fine, you can simply type something in some existing file and save, and you should see an M next to the file in the Groups & Files sidebar, meaning it is modified and can be committed at your will :D

Renaming an XCode Project

I found it is pretty painful to rename a project in XCode. Then I found that someone has done a script besides describing how to do it manually. Learn more on renaming an XCode project.

Monday, April 13, 2009

Starting Phase 2 of iPhone project

I have just finished phase 1 of my little iPhone project, which was basically getting acquainted with Mac OS X, XCode, the iPhone Simulator, Objective-C, memory management, and brushing up on some OpenGL. That was quite a lot! I'm not as proficient as I am with Java on Eclipse, but at least I'm starting to walk.

Now I will start on phase 2, which is basically migrating the little game prototype I made some weeks ago in ActionScript in Flash. It's a very simple game, similar to the old Frenzy game that we used to play on my cousin's old BBC Micro. The concept is similar, but you can draw diagonal lines, and the prototype I have done so far uses the mouse to draw the lines. The iPhone version will obviously use the touch interface. When I imagined this game, I always had a touch interface in my head. I have other ideas I need to experiment with to make the game even more interesting.

I better check how to setup some source code management (SCM) on XCode...

Saturday, April 11, 2009

Changing Process Priority on Mac OS X

On Windows I used to bring up the Task Manager (Ctrl-Shift-Esc) and quickly change the priority of a process, e.g. to a lower priority so that some process doesn't make the machine feel slow when some CPU intensive task takes over.

On Mac OS X I thought I would fire up the Activity Monitor and right-click to change the priority. Apparently it's not that easy. I fired up Google and found out that I need to get the PID from Activity Monitor, then launch a terminal and use the renice command.

E.g. sudo renice +1 -p 3692

A process can have a priority between +20 (lowest priority) and -20 (highest priority); 0 is the default. The above gives the process with PID 3692 a slightly lower priority.

Wednesday, April 08, 2009

Migrating Shortcuts from Windows to Mac

Copy Pasting
First of all, if you plug in a Windows keyboard, the Start key acts as the ⌘ key. Then, instead of Control, use the Command key (⌘): so Ctrl-C becomes ⌘-C on the Mac.

Undo Redo
The Undo shortcut is similar, ⌘-Z; however for Redo you need to hold the Shift key while doing ⌘-Z.

The hash/pound # key
To type the hash/pound # key, you need to press Alt together with 3. I got stuck on this when I wanted to type a #import :)

Navigating Text
In Windows, Ctrl plus the arrow keys navigates the text word by word. On the Mac you need to use Alt instead of Ctrl. (Luckily in XCode, the Ctrl shortcut does the same thing.) The End and Home keys take you to the very end and very start of the whole text. Very annoying, since on Windows we are used to them being relative to the current line being edited. To go to the end or start of the current line you need to press ⌘ together with an arrow key.

Quick Dark Scheme
In Windows I use WindowBlinds with my Dark Aqua scheme (inspired by the Mac). There are some skinning applications on the Mac, but I haven't found one that lets me switch completely to a black skin. However, there's a quick native shortcut which inverts the colors, so if you feel your eyes getting strained during those late-night coding hours, you can easily switch to a dark scheme by hitting Ctrl-Alt-⌘-8.

Killing processes
You can launch Activity Monitor as a replacement for Task Manager, or you can bring up the Force Quit menu by pressing Alt-⌘-Esc.

Fell in Love with the Mac

This is going to be a brain dump of the last two weeks using the MacBook.

I have been a Windows user from the beginning. I didn't like Macs much ("where's the right mouse button?") until I saw the release of Mac OS X. And finally I got my own MacBook. It felt like an alien dropped by and gave me his laptop: every shortcut I knew in Windows needs to be rewired in my brain.

Very nice machine. Besides being impressed by the hardware and OS, I was mostly impressed by how easy it is to get started developing for the Mac and iPhone/iPod touch.

I love the fact that I can drop into a terminal window and have unix commands at my fingertips. I have done some remote unix server admin, although I still need to learn a lot in this area.

Installing applications is dead easy... dragging into the Applications folder. You want to uninstall? You just remove it from the Applications folder! Hah. Although I have found a small application called AppDelete which makes sure to remove any config files etc. that won't be used any more.

Why oh why didn't I switch to a Mac ages ago? Money is always the problem.

I got the MacBook for development. So I got a second hand MacBook and I want to plug it into a keyboard and monitor (considering a KVM switch until I put my Windows machine down). I also got a Mini-DVI to DVI connector, so now I'm using an SGI 1600SW as the primary monitor and the MacBook as the secondary monitor. The SGI is quite old now but IMHO it is still a very good monitor :D. I like this dual monitor setup... on the MacBook I keep the documentation and on the SGI I keep the XCode windows.

Which reminds me about XCode and Objective-C. Currently I'm learning Objective-C, which apparently was released around the same time as C++, but I had never heard of it before this year. It has some interesting concepts like categories, but I miss being able to pass functions as parameters. I have used Java too much now; it got me lazy with that garbage collector :) I'm finding it annoying now to have to use reference counting to properly release variables. Got a lot to learn...

I started an OpenGL ES application for the iPhone. That just means that I let XCode generate the template :) I'm still looking at tutorials. The first stumbling block was drawing text. There are two ways to do that. You can create a texture for the text you want to show, but that leads to a problem if you are changing the text every frame, plus it's memory inefficient. The second approach is creating a single texture containing all the characters and keeping track of each character's kerning and spacing. Ughhh, I had done the latter ages ago on Windows. Guess someday I will need to reimplement it unless I find some ready-made class (which I haven't found yet). For now I will create a texture for each string :(.
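The core of the font-atlas approach is just mapping a character to its cell's texture coordinates. A minimal C sketch, assuming a fixed 16x16 ASCII grid (my assumption; a real bitmap font also needs per-character kerning and spacing data, which this ignores):

```c
/* Texture coordinates for one glyph in a texture holding all
   characters in a 16x16 grid, laid out in ASCII order. */
static void glyph_texcoords(unsigned char c, float out[8]) {
    const int cols = 16, rows = 16;
    const float cw = 1.0f / cols, ch = 1.0f / rows;
    float u = (float)(c % cols) * cw;  /* left edge of the glyph cell */
    float v = (float)(c / cols) * ch;  /* top edge of the glyph cell */
    /* u,v for the quad's 4 corners: TL, BL, BR, TR */
    float t[8] = { u, v,  u, v + ch,  u + cw, v + ch,  u + cw, v };
    for (int i = 0; i < 8; i++) out[i] = t[i];
}
```

Drawing a string then becomes one textured quad per character, all from the same texture, which also plays nicely with the batching idea from the OpenGL optimization notes.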

Monday, March 23, 2009

Getting a macbook!

It's been a while. Have been busy working with a game company last year, and this year I'm working as a lecturer. Teaching students takes a lot of my time, but hopefully during the summer time I will have some time to do some small game project.

I'm hoping to develop something small on the iPhone and try my luck publishing on the App Store. It's not for the money, it's for the fun of it. The iPhone is a pretty cool device and I've heard good things about its APIs. I've already got a quick prototype in Flash, but now I will need to port it to the iPhone SDK.

First things first... I need some mac machine. So I just bought a second hand macbook through ebay. Waiting for it to arrive on this island! Let the iphone adventure begin...