How to get the color of a pixel from an NSImage

In a recent project for a client, I had to implement a custom color picker. The process boils down to taking a screenshot and picking the color of the pixel corresponding to the mouse position in the generated image.

So let’s look at two options for getting the color of a pixel in an image.

  1. The first option consists of using the NSBitmapImageRep class:
extension NSImage {
    func getPixelColor(at pos: NSPoint) -> NSColor? {
        guard let cgImage = self.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
            return nil
        }
        // colorAt(x:y:) expects integer pixel coordinates with the origin at the top-left.
        return NSBitmapImageRep(cgImage: cgImage).colorAt(x: Int(pos.x), y: Int(pos.y))
    }
}
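As a quick sanity check, the extension above can be exercised on a solid-color image. This is a sketch: the 2×2 size and the red fill are arbitrary choices for illustration.

```swift
import AppKit

// Build a small solid-red image to sample from.
let image = NSImage(size: NSSize(width: 2, height: 2))
image.lockFocus()
NSColor.red.setFill()
NSRect(x: 0, y: 0, width: 2, height: 2).fill()
image.unlockFocus()

// Sample the top-left pixel; expected to be (close to) pure red.
if let color = image.getPixelColor(at: NSPoint(x: 0, y: 0)) {
    print(color)
}
```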
  2. The second option consists of directly accessing the image data in memory:
extension NSImage {
    func getPixelColor(at pos: NSPoint) -> NSColor? {
        guard let cgImage = cgImage(forProposedRect: nil, context: nil, hints: nil),
              let pixelData = cgImage.dataProvider?.data,
              let data = CFDataGetBytePtr(pixelData) else {
            return nil
        }

        let bytesPerPixel = cgImage.bitsPerPixel / 8
        // Use bytesPerRow rather than the image width: rows can be padded,
        // and NSImage.size is measured in points, not pixels.
        let pixelInfo = (cgImage.bytesPerRow * Int(pos.y)) + (Int(pos.x) * bytesPerPixel)

        // This assumes an RGBA byte order; check cgImage.bitmapInfo if the
        // image may use a different layout (e.g. BGRA) or premultiplied alpha.
        let r = CGFloat(data[pixelInfo]) / 255.0
        let g = CGFloat(data[pixelInfo + 1]) / 255.0
        let b = CGFloat(data[pixelInfo + 2]) / 255.0
        let a = CGFloat(data[pixelInfo + 3]) / 255.0

        return NSColor(red: r, green: g, blue: b, alpha: a)
    }
}

It’s worth noting that these two methods can return slightly different colors: colorAt(x:y:) takes the bitmap’s color space into account, while the raw-bytes approach reads component values as-is, ignoring color space and alpha premultiplication.
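When comparing results from the two approaches, it can help to convert both colors into the same color space first with usingColorSpace(_:). A minimal sketch, where the two input colors are hypothetical stand-ins for the values the two methods might return:

```swift
import AppKit

// Hypothetical colors as the two approaches might report them.
let bitmapColor = NSColor(calibratedRed: 1, green: 0, blue: 0, alpha: 1)
let rawColor = NSColor(srgbRed: 1, green: 0, blue: 0, alpha: 1)

// Convert both to a common color space before comparing components.
if let a = bitmapColor.usingColorSpace(.deviceRGB),
   let b = rawColor.usingColorSpace(.deviceRGB) {
    let delta = abs(a.redComponent - b.redComponent)
    print("red delta:", delta)
}
```

Comparing components without this conversion can raise an exception, since NSColor only exposes redComponent and friends for RGB-compatible color spaces.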