Get pixels and colours from NSImage







I have created an NSImage object, and ideally would like to determine how many pixels of each colour it contains. Is this possible?




6 Answers



I suggest creating your own bitmap context, wrapping it in a graphics context and setting that as the current context, telling the image to draw itself, and then accessing the pixel data behind the bitmap context directly.



This will be more code, but will save you both a trip through a TIFF representation and the creation of thousands or millions of NSColor objects. If you're working with images of any appreciable size, these expenses will add up quickly.





For new lazy readers: this solution is implemented in @gavinb's answer.
– jjabba
Apr 26 at 1:29



Get an NSBitmapImageRep from your NSImage. Then you can get access to the pixels.




NSImage* img = ...;
NSBitmapImageRep* raw_img = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
NSColor* color = [raw_img colorAtX:0 y:0];
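To answer the original question (counting how many pixels of each colour the image contains), this approach can be extended with an NSCountedSet. A minimal sketch, bearing in mind that colorAtX:y: allocates an NSColor per pixel, so this is slow for large images:

```objc
// Count occurrences of each distinct colour in the bitmap rep.
NSBitmapImageRep* rep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
NSCountedSet* colourCounts = [NSCountedSet set];

for (NSInteger y = 0; y < rep.pixelsHigh; y++) {
    for (NSInteger x = 0; x < rep.pixelsWide; x++) {
        [colourCounts addObject:[rep colorAtX:x y:y]];
    }
}

for (NSColor* c in colourCounts) {
    NSLog(@"%@ appears %lu times", c, (unsigned long)[colourCounts countForObject:c]);
}
```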





This is a very expensive approach, as colorAtX:y: will involve creating an NSColor instance for each pixel, as Peter Hosey notes. It is much more efficient to get the raw data buffer and walk through using pointers to calculate the histogram.
– gavinb
Jan 3 '10 at 10:28







Hi gavinb, do you have any directions (get the raw data buffer and walk through using pointers) on this one? Thank you!
– RickON
Jan 5 '14 at 10:22





@clearlight Well I didn't realise I was keeping anyone in suspense. I wrote a sample app which does just what is described above, and added the code in an answer to this question. I've published it on github too. Enjoy!
– gavinb
Apr 5 '17 at 13:21





@clearlight See my new answer below stackoverflow.com/a/43232455/172642 - this basically implements what @Peter Hosey describes in his answer. Using the NSColor colorAtX:y is definitely not how this should be done.
– gavinb
Apr 5 '17 at 13:53





This code renders the NSImage into a CGBitmapContext:




- (void)updateImageData
{
    if (!_image)
        return;

    // Dimensions - source image determines context size

    NSSize imageSize = _image.size;
    NSRect imageRect = NSMakeRect(0, 0, imageSize.width, imageSize.height);

    // Create a context to hold the image data

    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);

    CGContextRef ctx = CGBitmapContextCreate(NULL,
                                             imageSize.width,
                                             imageSize.height,
                                             8,
                                             0,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);

    // Wrap graphics context

    NSGraphicsContext* gctx = [NSGraphicsContext graphicsContextWithCGContext:ctx flipped:NO];

    // Make our bitmap context current and render the NSImage into it

    [NSGraphicsContext setCurrentContext:gctx];
    [_image drawInRect:imageRect];

    // Calculate the histogram

    [self computeHistogramFromBitmap:ctx];

    // Clean up

    [NSGraphicsContext setCurrentContext:nil];
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
}



Given a bitmap context, we can access the raw image data directly, and compute the histograms for each colour channel:


- (void)computeHistogramFromBitmap:(CGContextRef)bitmap
{
    // NB: Assumes RGBA 8bpp with no row padding
    // (i.e. bytesPerRow == width * 4)

    size_t width = CGBitmapContextGetWidth(bitmap);
    size_t height = CGBitmapContextGetHeight(bitmap);

    uint32_t* pixel = (uint32_t*)CGBitmapContextGetData(bitmap);

    for (unsigned y = 0; y < height; y++)
    {
        for (unsigned x = 0; x < width; x++)
        {
            uint32_t rgba = *pixel;

            // Extract colour components
            uint8_t red   = (rgba & 0x000000ff) >> 0;
            uint8_t green = (rgba & 0x0000ff00) >> 8;
            uint8_t blue  = (rgba & 0x00ff0000) >> 16;

            // Accumulate each colour
            _histogram[kRedChannel][red]++;
            _histogram[kGreenChannel][green]++;
            _histogram[kBlueChannel][blue]++;

            // Next pixel!
            pixel++;
        }
    }
}


@end



I've published a complete project, a Cocoa sample app, which includes the above.



Look for "histogram" in the Core Image documentation.
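Following that pointer, Core Image ships a built-in CIAreaHistogram filter that computes the histogram on the GPU. A minimal sketch of driving it (parameter names are from the Core Image Filter Reference; check the current docs for details such as the encoding of the output image):

```objc
#import <CoreImage/CoreImage.h>

CIImage* ciImage = [[CIImage alloc] initWithData:[img TIFFRepresentation]];

CIFilter* histogram = [CIFilter filterWithName:@"CIAreaHistogram"];
[histogram setValue:ciImage forKey:kCIInputImageKey];
[histogram setValue:[CIVector vectorWithCGRect:ciImage.extent] forKey:kCIInputExtentKey];
[histogram setValue:@256 forKey:@"inputCount"];
[histogram setValue:@1.0 forKey:@"inputScale"];

// The output is a count x 1 image whose pixel values encode the per-bin
// totals; render it into a small buffer to read the counts back.
CIImage* result = histogram.outputImage;
```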



Using colorAtX with NSBitmapImageRep does not always return the exact colour.





I managed to get the correct color with this simple code:


[yourImage lockFocus]; // yourImage is just your NSImage variable
NSColor *pixelColor = NSReadPixel(NSMakePoint(1, 1)); // Or another point
[yourImage unlockFocus];



This may be a more streamlined approach for some, as it avoids dropping down into manual memory management.



https://github.com/koher/EasyImagy



Code sample
https://github.com/koher/EasyImagyCameraSample


import EasyImagy

let image = Image<RGBA<UInt8>>(nsImage: NSImage(named: "test.png")!) // N.B. init with NSImage

print(image[x, y])
image[x, y] = RGBA(red: 255, green: 0, blue: 0, alpha: 127)
image[x, y] = RGBA(0xFF00007F) // red: 255, green: 0, blue: 0, alpha: 127

// Iterates over all pixels
for pixel in image {
    // ...
}




// Gets a pixel by subscripts
let pixel = image[x, y]
// Sets a pixel by subscripts
image[x, y] = RGBA(0xFF0000FF)
image[x, y].alpha = 127
// Safe get for a pixel
if let pixel = image.pixelAt(x: x, y: y) {
    print(pixel.red)
    print(pixel.green)
    print(pixel.blue)
    print(pixel.alpha)

    print(pixel.gray) // (red + green + blue) / 3
    print(pixel) // formatted like "#FF0000FF"
} else {
    // `pixel` is safe: `nil` is returned when out of bounds
    print("Out of bounds")
}





