
How to Desaturate a UIImage

I recently needed to figure out how to take a photo with the camera and then desaturate it, with control over the level of desaturation – anywhere from completely greyscale all the way up to the original image. This turned out to be trickier than expected, but in the end the solution was quite simple.

The first useful-looking bit of code I found was on StackOverflow. It was quick and easy to implement and works nicely, but unfortunately gives no control over the amount of desaturation – it converts the image completely to greyscale.
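
The answer itself isn't reproduced here, but the general technique – drawing the image into a Core Graphics bitmap context that uses the device grey colour space and reading it back out – looks roughly like this. This is a sketch rather than the exact StackOverflow code, and greyscaleImageFromImage: is just my own name for the helper:

[code]
// Sketch of a typical greyscale conversion: draw the image into a grey
// colour space bitmap context, then read the result back as a new UIImage.
// (For simplicity this ignores UIImage orientation.)
- (UIImage *)greyscaleImageFromImage:(UIImage *)image
{
    CGSize size = image.size;
    CGRect rect = CGRectMake(0.0f, 0.0f, size.width, size.height);

    CGColorSpaceRef greySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(NULL, (size_t)size.width, (size_t)size.height,
                                                 8, 0, greySpace, kCGImageAlphaNone);
    CGColorSpaceRelease(greySpace);

    CGContextDrawImage(context, rect, image.CGImage);
    CGImageRef greyCGImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *greyImage = [UIImage imageWithCGImage:greyCGImage];
    CGImageRelease(greyCGImage);
    return greyImage;
}
[/code]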

StackOverflow then produced another promising-looking solution. The DesatView class seemed to be exactly what I was after – create the view, assign the image, and set the desaturation level. Unfortunately, even setting the level to 100% seemed to leave purple blotches on my photos. I'm not sure why, but it appeared to be altering the hues somehow, and I didn't know enough about it (or have the time) to figure out what was going on.

Next, I came across the GLImageProcessing sample code from Apple. This is an OpenGL ES project that demonstrates how to alter the brightness, contrast, saturation, hue and sharpness of an image. As you would expect, the sample code works really nicely, and in fact allows you to both saturate and desaturate an image. I don't have any OpenGL ES skills, but it wasn't too hard to integrate Apple's code into my project. Unfortunately, for some reason, I just couldn't get the saturation working – brightness and contrast were no problem, but saturation simply wouldn't work for me. I'm sure I was doing something wrong, but time was against me.

The solution I settled on turned out to be pretty simple, and in fact used the first greyscale method mentioned at the top of this post. The answer is to create two UIImageViews, one directly on top of the other. Assign the same UIImage to both, then convert the image in the bottom UIImageView to greyscale. By altering the opacity of the top UIImageView, you get a cheap and easy desaturation effect: if you need 40% desaturation, set the alpha value of the top UIImageView to 0.6. Simple.
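
Here's a sketch of the set-up, reusing the hypothetical greyscaleImageFromImage: helper from above (containerView is also just my placeholder name for whatever view holds the two image views):

[code]
UIImage *original = [UIImage imageNamed:@"photo.jpg"];  // placeholder – use the photo from the camera
UIImage *grey = [self greyscaleImageFromImage:original];

UIImageView *greyView = [[UIImageView alloc] initWithImage:grey];        // bottom layer: greyscale copy
UIImageView *colourView = [[UIImageView alloc] initWithImage:original];  // top layer: original colours
colourView.frame = greyView.frame;

[containerView addSubview:greyView];
[containerView addSubview:colourView];

// 40% desaturation = let 60% of the original show through.
colourView.alpha = 0.6f;
[/code]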

If you then need to create a UIImage out of the result, this is pretty easy too:

[code]
// Render whatever the view is currently displaying into an image context,
// then grab the result as a UIImage.
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *desaturatedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[/code]

This creates a UIImage object out of whatever is currently displayed in the UIView object – you can even use this for subviews, if you don’t want to save out your whole UIView.
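
For example, to snapshot just the container holding the two image views (again assuming the hypothetical containerView from the sketch above):

[code]
// Render only the container view, rather than the whole view hierarchy.
UIGraphicsBeginImageContextWithOptions(containerView.bounds.size, containerView.opaque, 0.0);
[containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *desaturatedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[/code]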

This turned out to be a good lesson for me. There’s always more than one way to do things, and the simplest solutions are often the best. It may not be the most performant solution, or even the most intuitive, but the main thing is that it solves the problem. The bonus is that it’s quite an elegant way of doing it, as well.

5 Comments

  1. Absolutely Brilliant !!! Hope I can use that in some project :D

  2. I’d be concerned with the memory allocation of 2 separate instances of a large-resolution photo!

    • Fair point – I probably should have mentioned that I was working with a scaled photo; I only needed the final image at 320 pixels wide. If you needed to keep the full-resolution photo, though, you’d certainly have to take care, and this likely wouldn’t be a good solution for you.

  3. Neat trick, but I was hoping for a more general solution in which you don’t have access to the source file (i.e. downloading the image from a server).

    • There shouldn’t be any reason you can’t use this method for a downloaded image. Just download the image, store it in a UIImage object, and place it into two UIImageViews.
