I recently needed to figure out how to take a photo with the camera, and then desaturate it. I needed to control the level of desaturation, ranging from completely greyscale, all the way up to the original image. This turned out to be trickier than expected, but in the end the solution was quite simple.
The first useful-looking bit of code I found was on StackOverflow. It was quick and easy to implement, and works nicely, but unfortunately gives no control over the amount of desaturation: it simply converts the whole image to greyscale.
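For reference, that greyscale conversion works by redrawing the image into a greyscale Core Graphics bitmap context. A minimal sketch of the technique (the method name convertToGreyscale: is my own; the original StackOverflow answer may be structured differently):

```objc
#import <UIKit/UIKit.h>

// Redraw the source image into a device-grey bitmap context,
// then pull a new CGImage back out of that context.
- (UIImage *)convertToGreyscale:(UIImage *)source {
    CGRect rect = CGRectMake(0, 0, source.size.width, source.size.height);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 source.size.width,
                                                 source.size.height,
                                                 8,   // bits per component
                                                 0,   // let CG calculate bytes per row
                                                 colorSpace,
                                                 kCGImageAlphaNone);

    CGContextDrawImage(context, rect, source.CGImage);
    CGImageRef greyImage = CGBitmapContextCreateImage(context);
    UIImage *result = [UIImage imageWithCGImage:greyImage];

    CGImageRelease(greyImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return result;
}
```

Because the destination context has no colour channels, the draw does the desaturation for you – which is also why there's no way to do it partially with this approach.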
StackOverflow then produced another promising-looking solution. The DesatView seemed to be exactly what I was after – create the view, assign the image, and set the desaturation level. Unfortunately, even setting the level to 100% resulted in purple blotches on my photos. I'm not sure why, but it seemed to be altering the hues somehow, and I didn't know enough about it (or have the time) to figure out why.
Next, I came across the GLImageProcessing sample code from Apple. This is an OpenGL ES project, which demonstrates how to alter the brightness, contrast, saturation, hue and sharpness of an image. As you would expect, the sample code works really nicely, and in fact allows you to both saturate and desaturate an image. I don't have any OpenGL ES skills, but it wasn't too hard to integrate Apple's code into my project. Unfortunately, for some reason, I just couldn't get the saturation working. I had no problem with brightness and contrast, but saturation just wouldn't work for me. I'm sure I was doing something wrong, but time was against me.
The solution I settled on turned out to be pretty simple, and in fact used the first greyscale method mentioned at the top of this post. The answer is to create two UIImageViews, with one on top of the other. Assign the same UIImage to both, and then convert the bottom UIImageView to greyscale. By then altering the opacity of the top UIImageView, you get a cheap and easy desaturation effect. If you need 40% desaturation, set the alpha value of the top UIImageView to 0.6. Simple.
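In code, the layered setup looks something like this (a sketch with my own variable names – photo is the original UIImage, and greyscalePhoto is a greyscale copy produced by whatever conversion you use, such as the first method above):

```objc
#import <UIKit/UIKit.h>

// Bottom layer: the greyscale copy of the image.
UIImageView *greyImageView = [[UIImageView alloc] initWithFrame:frame];
greyImageView.image = greyscalePhoto;

// Top layer: the original colour image, same frame, stacked on top.
UIImageView *colourImageView = [[UIImageView alloc] initWithFrame:frame];
colourImageView.image = photo;

// The top view's alpha controls the effect:
// 1.0 = fully saturated original, 0.0 = fully greyscale.
// For 40% desaturation, let 60% of the colour image through.
colourImageView.alpha = 0.6;

[self.view addSubview:greyImageView];
[self.view addSubview:colourImageView];
```

Changing the saturation later is then just a matter of setting colourImageView.alpha – which you can even animate for free with UIView animations.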
If you then need to create a UIImage out of the result, this is pretty easy too:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()]; // requires <QuartzCore/QuartzCore.h>
UIImage *desaturatedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This creates a UIImage object out of whatever is currently displayed in the UIView object – you can even use this for subviews, if you don’t want to save out your whole UIView.
This turned out to be a good lesson for me. There’s always more than one way to do things, and the simplest solutions are often the best. It may not be the most performant solution, or even the most intuitive, but the main thing is that it solves the problem. The bonus is that it’s quite an elegant way of doing it, as well.