
Color of raw image pixel data and iPhone 4's Retina Display

In my Ambient Mood Lamp app, I have a color picker where you choose the background color simply by tapping it on an image, like this:

[Image: color swirl]

What I need from there is the actual RGB representation of the pixel color. To get that, there’s quite a bit of code involved. A large part of it is taken from Apple’s technical note QA1509.

Pixel color fetching is done in this if block from that note:


if (data != NULL)
{
    // **** You have a pointer to the image data ****
    // **** Do stuff with the data here ****
}
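For context, here's a minimal sketch of how that data pointer gets created, following the QA1509 approach of drawing the image into a bitmap context you own. The variable names are mine, and it assumes image is the UIImage being sampled and an 8-bit, alpha-first (ARGB) context:


CGImageRef cgImage = image.CGImage;
size_t w = CGImageGetWidth(cgImage);   // pixel width; an @2x Retina image doubles this
size_t h = CGImageGetHeight(cgImage);

// Let Quartz allocate the buffer: 8 bits per component, 4 bytes per
// pixel, alpha first -- each pixel is laid out in memory as ARGB.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, w, h, 8, w * 4,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);

// Draw the image into the context, then grab the raw pixel data.
CGContextDrawImage(context, CGRectMake(0, 0, w, h), cgImage);
unsigned char *data = CGBitmapContextGetData(context);

if (data != NULL)
{
    // read pixel values here
}

CGContextRelease(context);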

I picked up the code that reads out the pixel color from some web page I've since lost track of. This is the code:


int offset = ((w * round(point.y)) + round(point.x)) * 4;

// Alpha-first (ARGB) byte order: alpha comes before the color components.
int alpha = data[offset];
int red   = data[offset+1];
int green = data[offset+2];
int blue  = data[offset+3];

color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha/255.0f)];

w is the width, in pixels, of one row of bitmap data, and point {x,y} is where the screen was touched. The * 4 in the first line means 4 bytes of raw data per pixel, and that stays 4 bytes regardless of screen density. What changes on iPhone 4's Retina Display, with its 326ppi resolution, is that the bitmap holds twice as many pixels in each dimension while touch coordinates are still reported in points, so both coordinates have to be multiplied by the screen scale before they can index the pixel data. Which means the correct code now is:


CGFloat scale = [[UIScreen mainScreen] scale];
int offset = ((w * round(point.y * scale)) + round(point.x * scale)) * 4;
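For completeness, here's a hypothetical touch handler showing where point comes from; the colorAtPoint: helper is my own stand-in for the offset code above, not something from QA1509:


- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // locationInView: reports the touch in points; the scale
    // multiplication above converts it into a pixel index.
    CGPoint point = [[touches anyObject] locationInView:self];

    // colorAtPoint: is a hypothetical helper wrapping the
    // bitmap-context and offset code shown earlier.
    UIColor *color = [self colorAtPoint:point];
    self.backgroundColor = color;
}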

Welcome to the wonderful world of resolution-independent programming.