This Is What It Looks Like When a Neural Net Colorizes Photos
The results are spotty, but when they turn out well they're actually pretty impressive.
Timothy J. Seppala | April 2, 2016
Example input grayscale photos and output colorizations from the algorithm. These examples are cases where the model works especially well; for randomly selected examples, see the Performance comparisons section of the project page. Source: http://richzhang.github.io/colorization/
<more at http://www.engadget.com/2016/04/02/neural-net-colorizes-black-and-white-photos/; related links and articles: http://richzhang.github.io/colorization/ (Colorful Image Colorization. Richard Zhang, Phillip Isola, and Alexei A. Efros. [Abstract: Given a grayscale photograph as input, this paper attacks the problem of hallucinating a plausible color version of the photograph. This problem is clearly underconstrained, so previous approaches have either relied on significant user interaction or resulted in desaturated colorizations. We propose a fully automatic approach that produces vibrant and realistic colorizations. We embrace the underlying uncertainty of the problem by posing it as a classification task and explore using class-rebalancing at training time to increase the diversity of colors in the result. The system is implemented as a feed-forward operation in a CNN at test time and is trained on over a million color images. We evaluate our algorithm using a "colorization Turing test", asking human subjects to choose between a generated and ground truth color image. Our method successfully fools humans 20% of the time, significantly higher than previous methods.]) and http://www.engadget.com/2016/02/28/google-neural-network-locates-photos/ (Google neural network tells you where photos were taken. PlaNet doesn't need obvious landmarks to locate an image. February 28, 2016)>
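To make the abstract's key idea concrete: the network doesn't regress color values directly; it classifies each pixel into one of a set of quantized ab color bins (in Lab color space), and training reweights the loss so that rare, saturated colors aren't drowned out by common desaturated ones. Below is a minimal, hypothetical PyTorch sketch of that framing, not the authors' implementation: the layer sizes, the placeholder weights, and the toy input are illustrative assumptions; only the classification-over-bins setup and the class-rebalanced loss come from the abstract.

```python
# Minimal sketch (not the authors' code): colorization posed as per-pixel
# classification over quantized ab color bins, with class rebalancing.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_BINS = 313  # the paper quantizes the in-gamut ab plane into 313 bins

class ColorizerSketch(nn.Module):
    """Tiny feed-forward CNN: grayscale L channel in, per-pixel
    distribution over quantized ab bins out. Layer sizes are illustrative."""
    def __init__(self, num_bins=NUM_BINS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Conv2d(128, num_bins, 1)  # per-pixel logits

    def forward(self, L):
        return self.classifier(self.features(L))  # (N, num_bins, H, W)

def rebalanced_loss(logits, target_bins, bin_weights):
    """Cross-entropy with per-class weights that boost rare (saturated)
    color bins -- the 'class rebalancing' the abstract mentions."""
    return F.cross_entropy(logits, target_bins, weight=bin_weights)

# Toy usage: one 64x64 grayscale image, random ground-truth bin labels,
# and uniform weights standing in for the real bin-frequency statistics.
model = ColorizerSketch()
L = torch.randn(1, 1, 64, 64)
target = torch.randint(0, NUM_BINS, (1, 64, 64))
weights = torch.ones(NUM_BINS)  # placeholder; real weights come from data
loss = rebalanced_loss(model(L), target, weights)
loss.backward()
```

At test time the trained CNN runs as a single feed-forward pass, as the abstract notes, and the predicted per-pixel distribution is decoded back to ab values and recombined with the input L channel; the sketch above stops at the training loss.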