I've been writing an image processing program which applies effects through HTML5 canvas pixel processing. I've achieved Thresholding, Vintaging, and ColorGradient pixel manipulations, but unbelievably I cannot change the contrast of the image!
I've tried multiple solutions, but I always end up with too much brightness and very little contrast effect. I'm not planning to use any JavaScript libraries, since I'm trying to achieve these effects natively.
The basic pixel manipulation code:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
//Note: data[i], data[i+1], data[i+2] represent RGB respectively
data[i] = data[i];
data[i+1] = data[i+1];
data[i+2] = data[i+2];
}
Pixel manipulation example
Values are in RGB mode, which means data[i] is the red channel. So data[i] = data[i] * 2; doubles the brightness of the red channel of that pixel. Example:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
//Note: data[i], data[i+1], data[i+2] represent RGB respectively
//Doubles the brightness of each RGB channel
data[i] = data[i]*2;
data[i+1] = data[i+1]*2;
data[i+2] = data[i+2]*2;
}
*Note: I'm not asking you to complete the code for me - that would just be asking a favor! I'm asking for an algorithm (even pseudocode) that shows how contrast adjustment through pixel manipulation is possible.
I would be glad if someone could provide a good algorithm for image contrast in HTML5 canvas.
A faster option (based on Escher's approach) is:
function contrastImage(imgData, contrast){ //input range [-100..100]
var d = imgData.data;
contrast = (contrast/100) + 1; //convert to decimal & shift range: [0..2]
var intercept = 128 * (1 - contrast);
for(var i=0;i<d.length;i+=4){ //r,g,b,a
d[i] = d[i]*contrast + intercept;
d[i+1] = d[i+1]*contrast + intercept;
d[i+2] = d[i+2]*contrast + intercept;
}
return imgData;
}
Derivation similar to the below; this version is mathematically the same, but runs much faster.
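If you just want to wire this up quickly, a minimal usage sketch looks like the following (the canvas id is a placeholder, and it assumes an image has already been drawn onto the canvas):
var canvas = document.getElementById('myCanvas'); // hypothetical canvas id
var ctx = canvas.getContext('2d');
var imgData = ctx.getImageData(0, 0, canvas.width, canvas.height);
contrastImage(imgData, 40);      // +40 on the [-100..100] scale, modified in place
ctx.putImageData(imgData, 0, 0); // write the adjusted pixels back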
Original answer
Here is a simplified version with explanation of an approach already discussed (which was based on this article):
function contrastImage(imageData, contrast) { // contrast as an integer percent
var data = imageData.data; // original array modified, but canvas not updated
contrast *= 2.55; // or *= 255 / 100; scale integer percent to full range
var factor = (255 + contrast) / (255.01 - contrast); //add .01 to avoid a divide-by-zero error
for(var i=0;i<data.length;i+=4) //pixel values in 4-byte blocks (r,g,b,a)
{
data[i] = factor * (data[i] - 128) + 128; //r value
data[i+1] = factor * (data[i+1] - 128) + 128; //g value
data[i+2] = factor * (data[i+2] - 128) + 128; //b value
}
return imageData; //optional (e.g. for filter function chaining)
}
Notes
I have chosen to use a contrast range of +/- 100 instead of the original +/- 255. A percentage value seems more intuitive for users, or programmers who don't understand the underlying concepts. Also, my usage is always tied to UI controls; a range from -100% to +100% allows me to label and bind the control value directly instead of adjusting or explaining it.
This algorithm doesn't include range checking, even though the calculated values can far exceed the allowable range - this is because the array underlying the ImageData object is a Uint8ClampedArray. As MSDN explains, with a Uint8ClampedArray the range checking is handled for you:
"if you specified a value that is out of the range of [0,255], 0 or 255 will be set instead."
Usage
Note that while the underlying formula is fairly symmetric (allows round-tripping), data is lost at high levels of filtering because pixels only allow integer values. For example, by the time you de-saturate an image to extreme levels (>95% or so), all the pixels are basically a uniform medium gray (within a few digits of the average possible value of 128). Turning the contrast back up again results in a flattened color range.
Also, order of operations is important when applying multiple contrast adjustments - saturated values "blow out" (exceed the clamped max value of 255) quickly, meaning highly saturating and then de-saturating will result in a darker image overall. De-saturating and then saturating however doesn't have as much data loss, because the highlight and shadow values get muted, instead of clipped (see explanation below).
Generally speaking, when applying multiple filters it's better to start each operation with the original data and re-apply each adjustment in turn, rather than trying to reverse a previous change - at least for image quality. Performance speed or other demands may dictate differently for each situation.
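In practice that means holding onto the untouched ImageData and re-filtering a fresh copy each time a setting changes, rather than repeatedly mutating the displayed pixels. A rough sketch of that pattern (the canvas id is hypothetical; contrastImage here is the percent-range version defined above):
var ctx = document.getElementById('myCanvas').getContext('2d');
var original = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height);

function applyContrast(percent) {
    // copy the pristine pixels, filter the copy, and display it
    var copy = ctx.createImageData(original.width, original.height);
    copy.data.set(original.data);
    contrastImage(copy, percent);
    ctx.putImageData(copy, 0, 0);
}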
Code Example:
function contrastImage(imageData, contrast) { // contrast input as a decimal fraction; range [-1..1]
var data = imageData.data; // Note: original dataset modified directly!
contrast *= 255;
var factor = (contrast + 255) / (255.01 - contrast); //add .01 to avoid a divide-by-zero error
for(var i=0;i<data.length;i+=4)
{
data[i] = factor * (data[i] - 128) + 128;
data[i+1] = factor * (data[i+1] - 128) + 128;
data[i+2] = factor * (data[i+2] - 128) + 128;
}
return imageData; //optional (e.g. for filter function chaining)
}
$(document).ready(function(){
var ctxOrigMinus100 = document.getElementById('canvOrigMinus100').getContext("2d");
var ctxOrigMinus50 = document.getElementById('canvOrigMinus50').getContext("2d");
var ctxOrig = document.getElementById('canvOrig').getContext("2d");
var ctxOrigPlus50 = document.getElementById('canvOrigPlus50').getContext("2d");
var ctxOrigPlus100 = document.getElementById('canvOrigPlus100').getContext("2d");
var ctxRoundMinus90 = document.getElementById('canvRoundMinus90').getContext("2d");
var ctxRoundMinus50 = document.getElementById('canvRoundMinus50').getContext("2d");
var ctxRound0 = document.getElementById('canvRound0').getContext("2d");
var ctxRoundPlus50 = document.getElementById('canvRoundPlus50').getContext("2d");
var ctxRoundPlus90 = document.getElementById('canvRoundPlus90').getContext("2d");
var img = new Image();
img.onload = function() {
//draw orig
ctxOrig.drawImage(img, 0, 0, img.width, img.height, 0, 0, 100, 100); //100 = canvas width, height
//reduce contrast
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.98);
ctxOrigMinus100.putImageData(origBits, 0, 0);
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.5);
ctxOrigMinus50.putImageData(origBits, 0, 0);
// add contrast
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .5);
ctxOrigPlus50.putImageData(origBits, 0, 0);
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .98);
ctxOrigPlus100.putImageData(origBits, 0, 0);
//round-trip, de-saturate first
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.98);
contrastImage(origBits, .98);
ctxRoundMinus90.putImageData(origBits, 0, 0);
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.5);
contrastImage(origBits, .5);
ctxRoundMinus50.putImageData(origBits, 0, 0);
//do nothing 100 times
origBits = ctxOrig.getImageData(0, 0, 100, 100);
for(var i=0;i<100;i++){
contrastImage(origBits, 0);
}
ctxRound0.putImageData(origBits, 0, 0);
//round-trip, saturate first
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .5);
contrastImage(origBits, -.5);
ctxRoundPlus50.putImageData(origBits, 0, 0);
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .98);
contrastImage(origBits, -.98);
ctxRoundPlus90.putImageData(origBits, 0, 0);
};
img.src = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAMAAABHPGVmAAADAFBMVEX0RydFRjuPweRoak+awuFzbknzUj9SV0T0RRVtb1lcak7zUTSUmX94hnOYoHFacmpCPS9zwvOBkG5Zbl7xVCCVxuhnfnV7fFOnoqU3OTFQZVNNXlCmxeBtfWlhaFlbXUmfyOVrh3/vXUiBfUcrKCF9wu50fXB4fGVlcGJIUEvtSDCUvd2Up4uJnYBUX0A7QzjtW1tYZUlqY0U1Miivy+BfdXJ/hWtuc2eNjWSCkWFbYVVhcU6Pl2tsf13xVUxyueeir4N+lICIkXdIUjuxr7JyfH1iemiHhFT5Rx+7zN2Xn2KCf11jeFl1jYhSZ11OTkGJxe6Kt9ybrp6fqZBkfoZ/in53jntBTUN4h2FPTDFFRy+GwOeju9iVp35dWj1uh25gZWqNmF9JWUX3SjOvxNtyjZlYeJdqrNeMn3F2dFmNj1KWkktFV1W7vbd/l4ydkIVOXWDgVEjwWzlqb3WXq6/Bp6OHlJuWn5aKmotSZ2t/jVWRgFKmkU92eUF+rdaKrNCnsqVnh5BqeEyHh0POmzxiZDuotbqkrW+lnmvEo1BafqZ3gYnmZWKonFyZjlvvZ1Xev0hlu+5aksKyr5RLaI+YkmrXZmqYnFa3l0tXVDZpXzGFoqWlkZa3hYHrbmbXuFGLo5Z4hU3UqkdhbUBvbThxoM29u6SVoaNVbXziXlRVVVRZndPNsl1/uOHCsbB5mJWSjZE9PyWWs9egpX+st36lm3eysnJ6bTdxkb5gcYfSWlMfGhxpwPdloLp8mqWCi4++s344SVKih0aYfT6JcTy/TzwsLzZAMR+drM2/kJGrnpBafntHYnk1PUNfVC3SUCpfr+XTc4OGeXnjaHSlqV+sgzm9mpviSTlRg7Rrh7HmdXbCdXDSeW5AVWy2pWHZa1ynt5Hge5DNiouMgIbGtGq6w86+ZFmkV06+jzxOQyXWiKKzcGeFosRgkajdrzqddHOEcFN6SCTRig2dRCbjRiK1ahXMnLV6VzuEXFyeaiy8voxgPh/CfBK6lK6ciHeFa2q3ecQCAAAjI0lEQVRo3iRVf2gbZRjOGUroaO5iTq8mx6WGxDTBpGGNl+baFJOGpXhtLs4mzY+RxjvMTJNQAw5tTuJsjnCiKAiTsNSBoVFGRWpsyJBuheps/oiNIhEEHQQFR9Hi2lFq14J+6sMH33HHve/7vD+eV9KO7Jav6nTVfL6cFuhiZYU/UzmTNCvNbkIbJ1B0HJ12E74FrRdDFxgkgfgwBlkKrG36tFoEO+dbY9ZaGGJktJtrmUu+hdmg3Y0Nmp2LdaVIizCvFFPF4YLkpFy9WtTpdOl0viRUK2yFpyrOsEW5uIChLVxrIUaMC1oMR7ksgjMkjobi4MIQn5pEsr6cOoRkGBJpBRKbl5Z9pGFBHjQHzcqusuFQsmFRGXZ0+GJeslu+kS+lBSFP06UqTYtjcYof01hmUR/mXcPOzA6ZvCTJQFaGUfs9fivHWT1qJhDKJsjcEo4wiDZHtgKB5cRy4pI2Lp8YIeTmcaUTnocbIlxX9vMiL6xI2rvtok6odPg0neJTYpVyOQthftZ4aTmhZRBjKIubEMiPk4jMpIayIavJ6gkxrcCty0ubawGEJAPLm8trr1z6Y2lzOeHjxs5qzEbY7l6P1XVUfV0s6XQFnpLsttt5oZRKpdOlEptiKZ6OVobOElo1ksNa5zASRWUQTiYCATK7xLSWSDLRWtq8fP7y8iVfC/rCt7m5mUgktCSJYAEUw0zucY1RA/dSsDNJTcVYlk8W8nlJu3yjmq/k2U4lla+wPM+OrRSkQ2rvKIliWhLL4iiSg0xMNrAUWP5jOcEgy3ubfyQubyImtTVE5pAsYwqZOEaN+9WQWnY1MkBopNL+nv6+usslOhsuuLAyB5xUK4U0O9kQOwIrUvmVWmT67JgfHYI4DpJZTR61TKaIKqDQKqkFfhJqda6VWMJQT8TjUaNciOM4k9/jiQIoIDUyKpMW5vqdFCvWO7Qz6ehJ83MUKHw1nU5R5XQJpKo0VChIr8qGIpGoAhx/TVFTNJvgKdpUQNcDJB7KqlW5AOOJAJsej4eDojarAoIUCigalXlqNQXkV1w1jivtDidP0XXluK5U5PslN/JzVSGv69AlgRallamiTCPzN8FvEUXTFm2qFDab4n/4FX5U4YnW/jtNGyCnCkUi5Rv7J7vWmoKDIBXgA/mNoL/6Vkou18T6A5eSoni2KImAOUkL777zDl2ZpHlW2a8ag0wyK/RvfCrV9m7z5KTZtAGoVAoVsKzGEc4D2WzDGky9bTw8PT48OP0BtelRz79sOJnf2CuPycMs7FDW63SHH6AKZUk1JayUUg6xmE/RIp3v6ZH6QUw4JAtZVfrt6uHp4f7h4fGOCYP0tppN4RnycB6PCpqfHx9SbcMHB8Pzxwenyh0NET+LAxCEN2iZmFY67esPlFLRORhzshWJcEMQ0pNsp0E3XC44SQ0MQP5VyGZVezirbefLw9PmHPBCbuPjMj1w8y+iNWi4J+y16vXJ04Pjk8PTU/mOafHDOMOgCBlfDbonYkkYnqAqLFunpAVpv6QslIr5fl2p02nAohPuGzSARuc8HEoYxzzbwunB4f3Dwx9MG3pjb6Rpa9aioOa1SFkaj25sbOA/HACcfpXTjy6a3STqzRmC8t6ZCSrspGjKOQ7TKTa/kpcIc8JwOiUUHWyn46Rpe2wwGMRQNOddtKj1evzwPyv3bBsKPwE19fpapNau1dq9YS4yHNHrZcfg62ROFcJmPoyToyjmm0XNFioZg9c7rJi0jHlA5XnJbaGtK5fSOoF2pkSXE7bPzwQXnkdaS2uzGpCQ0D7gsq/XN3cjmjNNcM8Vd6PtI/k095pOZbOBF03r9g7iXTcbRnCvL240PBeG+5LJcLI7E0vyNGx3FouS2zeAcqXSAtxolHSlyZTDbggaZjEExGQfUls39Bv/wX9WymmN+vbubi1yNHyVwCHOmLGGrDs7W6YtbTz4qtsoY2QGr1zrtiyau65uzNJ9QFN0QckDWbnd1glCzzudfLpRSoNKwQ4i2Of1erVo3DKNqq2ZzNZWJpNdWHh11qRoHp3st4+O7o14VEME2AAtJsNkEa07+GoQz6CjBEaMeONds8VZX5+BKTEZc5b6K5WKZHclBeQxVS9R9QbLlh7ABTgYXJz2xlHES/R64xyOqhnjyz65mtNbd3ua+6/t3/7Tx3hVGu3CWhDDEHnw1ZlFjUmLoASGy71yi91icXVnZrousyuZhGkHOySp6srptK6fFTssC3YZLFLzU4Q9KCfPkV4ZITXOEoOG4MjbWEBrQvRz905O/j4+/uvtpVyLw02ake9f+uyDx37tlWUx4EOuWZCfkxOxriVMwV1XjIVFsFAc/LRkF+wsXXVSpEG3daZ6qKTTPTGxOGLADFovbuUG7d/b7YOG0dVVBjGFRu7t37129/j26nIrlPnm4t7e3lsXv1i1qTItBGxLwj
holGN9MaWSguszgEa9HoMtlKMgKbfTQk9HLDlZGoZddZ5PJgeDSoPh3PMEpsVDJqvfD5nuvHVxa1uVMRrunhxd++349wEDY3vrxfce/hffvfjW9RyDY5rxEa+bMLrNix++YVHOzFjqrgegv4Cyj0mK5XQqVUhVikJJl4cbIF8wOzY68rKBxI0hyLpz58qPP/64d+v8hT391tsv3W0eXTs+/P2vkNW298Sjjz7+6IXXby0FrmczS1oGnT6jwdxueczi+viN51wPzN1YLOkc7uFXwMQPT+VT9UYq1Uk56Tcd/evvw30yVGs0DHAmq/XOj9/dvAniffrTz/2m76/dP9n/4P63v/0C1RQfPf7UU0+evwWwd+VOxpRFNMSZabtl0R2LKWOuriVW78bqSWk4yVdA4XU6YbIkpNIdIPdAWeZ7+/ocBK5ZxVHT1pUrVy6++PATT1x49sIfR9Cfv9zfP7r2yLe/XbtXG3jomRfOn//0iYdv3rz54sVcNpdF1ZowEbbLqeQsDMOWJOARp8CknKlI2lVhuFzV0ZPJyR66KDbenbI7pwbtfpnMqIbufPLdd7euf3ThUYDH7kf+/Pnu/t1fHvn2659+L597/KnLTz70LHDy3i2MvP6NNYuMjstGCWPc7IorxfrMG10XHYu5YlSY/4cG841po47D+HG9E1oo0KPHOVbWa2vHtrYoZ1cL7dk/i3S1Zf2zpV5XuIaFWWCEhMboFKlQgy4GR9oMlk4lJQUdNg2Rzb+DJSaiKGJZZnQhLGwmakIk0YzgK43fW+K96Mv73PM83+d79yuC46Oui4Ov4+fe7r08aLVaKY/ZIzZ5hFX/3Jd96VQqFdsJ0BhJP7r819b48l/LG+W7xcmtv2+SOh1G+v2y2NmlpfX19NWrtfXixlKrydDtdEAadkfHdb1B393MNncjGYZ3Ua5zLnPvZcp1+RVPz2WzK5xhlC0nTlzKL72VWpRpCwnMSJPvFn/f3HrwYPmWXL67tfngJorRNAiMLcVsizLb0tWjp47WeYlDIUeUMHTb9d2Q/3VYXt3NXi8y8+mVDDOjqex89qJG1FvpabD2DjElT0mPHt1/J2Vbyi9qI5GAzmikfyoub8SLD36fjMvlc1vLAoSk6VjMlloEx+ATbETVFo1WtYaCQQPBGfT2jo57do6zdk9bEJwynxvVUJ9qBimP/ZNnPWaKp0zPSFtb9q+lF22FdMoWQTFdl5v+tlicOrJbPLIhL5Orx7/4AKVpiOVh8us7SzurL58dGZk/+05WyhkAc/LesN5ZYXeYfuxM5qAnYaV5CKfODc1Qva4fzQ2mSmmFVCo90NeXltkiibxWgLS7jd8Vt6bkxeLAuFwuV0/99iKKKYx0IKKVyVLppYJtafXl2v6q0Hy2/55er7effB9+r3cnX6ns5HIILqKGZnqtVPOFyxrrhbc9BGeyO021qoO1H76VAqtighIM/HqvOK6WF3fVW3GAjM29i2JGI4ZC8rLYrE0rW9859WLbTdX5tnmD3iC12+2Gjk5rs9XZ/EoyicyE8SszzJDGpTn36Qx18VXKYyXqDzrbDr38/FvrAAjEIgUU1SmM78nH1UeK8qkxgMSn5l5AhTLCeGltMWAA5GwbfA9ns47z54eHO5xWO1FpfbaZo5JEM8KLfEMMbh7kD0wM4i5PRUVDg/PaU4YDVZ/nYX5taCARoVEUbQfIlHp3V65Wg13lU3OvkUa3gqQFSAHUpPKnTp09O/9C9p0XgnrpyWBQX2m3VnSw3L3u6QzC4zyvwX24hhLjvS6R024iOnvYp1oO311Pr8siKJooYKggZTM+DhrK1YIS+dgXr9FuN0aDkkIBLIP52jlV2zbyzgc354PDQb1Tn9UbTuo7DHpv0jKNMJQGd/X09EAclMt87U1zM+Gsb2g5/OH30DAIhQwEMDSC6bDN+BPqsrL4FKAglIE/jMCg/SRaiAi1X4/FTq0ejar6g8HgS8Gs81qFw+AkOIKT5KZzCONjcCXP2M3UBGPydJorLrga8JaWQ4cvpbRpmRakBDAsgpLYpno8XrZbPr4Rl5eVqQdWIHYSihIJkAIkJkuNnIBTU1twPgtKCOt1R0OHk7Jy08dyAHFp+DBuxl24+NqFH03EhFnMtByUStbuLPohUZgvnU5BCpB4Ody9XD2uhqLEAaIQ2kgGUFKIRICsqhwhRygU0g9DFfVtQU5KcFxzstGCMHwpw2RwivLgLleltcHTA8mblL676UWtLQ9SSB2GPYSo1cehhrc3NgTIFytGYChoVIcJkHWbLLW0Cm2cVwHkZLb/ZIfD0MGBXdO5XCMiEvdQVCbDizW4xkU5r33SzZkqpJJ9dwFQeAiB0GkSxTbHAYKoNzaENsYnFxRQeQUWCJACJA9idvYfa82qVPP9Hef1Uj2UxaCXshyb81oQimGmcT4XHsKZixqJuaPC02B2Ur61S/nFRW3aFoE1D02E8doafwTcmtu+D6GAXe0KNIIaaR1Kk0JVtLLUal1Nq0qV7R/O9hv0wX4xUcmKuonmZNKC8JnqMP5DdRLHJ6hpymQ2TVhNImX1vnQ+pfUnYloSo4XNhbZvPR0vK0OeuH97ShivuXYIQwGvRlroYywC784RgLTWOvrrQw6DQRoNmaTRplwuyULjv2KSOMPnNAwvFlOueqmzouZQiUj5Jbyu/GRi1ibslHaArGyMlwNkbPv+2P8QzK3QoaQgZDbil9mqztTU1YWC8yrQ4Qi1OQzRqJezNLGsBRkc9IhEvDKsuXLY19MpxnsaKpW8WLmWT+W1fvSzAqrAsPZfjZGVsUegJkfG9rbVADk+0F6IKABCCpBYQutPxarOeOv62vqz2RBhCIWi99gDLOGVsBZLM/KvxzN4hQ9Xh8OEiBl9vfrDHoopaRGt5W0wXuhsDDWC8W6A/DL1CEBu7+2NwSzLAYIa3e0YDXbZZlHIfqmu7oTwR0mtNHveYSAIluCiFouXtUjqEaX4oEe5T3Slel9O5DtWKlaWKJWMr+TrtA22ij8xW1AABEJZGZsqFyDbe5CJAAnAl4QONjHpnw3QfmG4Hj/R6gxFVQSUJVoT5VggsF5vNMoifzItjJIrFZX68NFcY2mFSAknUabk6zuyfD7lRxMJhRFCAYhaXXYcOTJ2f+82QJDJdhQDiMJtpAsAIbW21b7n9u9nHdBGCJ4T7t+UlEjqJWy0FRnsMTOj35zO8NUlFvx0LgyeZapL9j12aTEPfpGJhO4hBH3jiUfKQMmtbbBLXo7MtaMK9wLmdrvpRIIkIwA5U7dfUuflvBIvwcLzWxq5psbG1qi3SYKIGkwNYj7D88pROKg9WVriOx328V/fuLtYuLNuI9GFQFeXTicoOQ5KIJPtsXIBsoAqumATGxWBBLil9ceunulTRSVNLEd0c1D1+lyp1wJXY2NjKfInZZ7o8TAininhw5mPfGLf6Wo4ud9Yk2ln87IIVFrRpQi4FSsDIAQZ2P5ne7IcQZBJgLTDq8xoTLymI7VaWX71cFVNTU2rlwWnWJMwuvWSRom3qUlSily4OPHxhJkwi0XKTObYR77SJ
6vDvup9N26sywppGdwpAW7puowAgevnW//s7amPy8tuBbCuBUyhcOv+WKBJmXZ9qepMXU1TjbeVhadvkkgsJgvrrav3llqaGv/jwHpik4bDaJk4WDk4g3/SUsemXTIgaa1wUIgSU8VVCbMXm40GlTQjUQ5unoxIEyNxkIEZQcI8eFBJRDYTe5Aa9DC2EEyGkRDnFh0HMxJnOHhYdlvih03ogcvr+977Xvt7SCSVejpz+tLpE5cuHX31hQ28658wa7ogmY/+hBtAZJvNxoucmO2CGBq77bfPD+mRX7wNvEVwQIQY7IZXyCcIuGfIi1JnUOaoy8W8pLxeisKZII4j8VYkOZP6e/nK0dj+WGye7u+3a3QMTZZX5/xq3i9pFR5iUOa0P0dhXL2N3d1d8LJ1jScABF5cHcUxCMEV+iQEcd+RI0eGPM6zLhas6/KcBUKUiWFZJNlqzdyKQOtxO3X7wTjUG+x4v52GY+1m0W1LLxISKE9o4fd5dBRBphp7f/48mUamLvgdEvzJARHuMMjufo0JQdrk8RivH4P1Aym8Hg8KowpSLIwrksslHyffJ5Mzlx/aNWPYNolN0MBkc32O2AAQUeYkiRcdF3oRRH8u+2uveRwxgIM5UEoSOzaO64EUzoeDNI7jJhxHqQHm4UPUfI0xMsdQnJkPskglF2+13gPGzf0TdnsVwyYARQNMlta/LsohLSd2JNi6GlebhXlZs43GrEGPNAa1nAzbKMpSm7gKRD7RLEvjB2BVRsBcRqd5wMxCH4eijE43hvyOb7WSM8nUiVPz4/MUjWF2FhohAStX14tuPtTTlmS5DTdOnAUmO1ONxslRvXVvUCtFCZilLHE9kJDuSZqmBV/4wAGjN9j1FqiBukwowzA6igWQOFCJpO6vlEoPdOMkoJB26KDK1Uzabcu721xNromSLBENcNdOttHs1QMIfKRGJYLowA0Uic5t0lg47DOZfB78OgWh5WQYIMEAWAyW8cXC1lYrF6kkU6VXUGtgVZIkgQpZrmcyX/2dkE2So/DAco3YQ2BPprIFAyy+yIsdWUsoqsj5h4c3rn4KY3AGNgnGoSEPzhiDThdjHvGgb8w6iqECSByuXC7+rXLrypXtbQxSi6QtglCtVpcyxWGlyIu1PAxfUcQ1q1X/PdsoIGAyQhRVrdaRVyXC7U7zBxNL9fCdMBv0GeFiUA3j6dvHDDhHnAz1MkYhj3K5HMBElleelUr7Ndi7LgtauLdEVtcTcz3pRaWjKpwoyrCNO13hm9MGa0F0dFStg1A3HOAsdTGU2AzXl27cuTN0dwiKPRbSF3LSO4DqxnSBAPWfCWBUoOReKT0bM5FljBTqNGkK11fTabca6oGCVlQUuVaw6hFDtpk1GKxTNW4DpmWb7EDGp/Oh4urq5j0fLVhYk8l5EULRbKb6+vqO7WPMFBWIIS/+g3xbWVm+XXkwUSZ1pqowYhGwKgbH60QxOhni1Q1ZUWprs1ZQvNB8bpietrbFD7zNFv0QdbuLajSdTqzes9CQK+GgxWO5ex4UcbogHVGXOQaeRRYWFkCT5eVvlcitAPiwbtewFrzbDdXrmXR6eHIyqn7gCaX2o2BFkN7m7D+SzDW0aSiK45FuDWmdHQqVOZewKl1bUeNji0StD6gIWoVZHT6YhZmIUKXthhqEboUimhU2NIjPVqrRbDo/zA9LFNRaq1OjQ8ZoUyxSRSxTCzLQgQw8mSf5khDuj/85Nzc5/2szhl2umWEFfpFZvygO5btFIc+ywUDACUosnJ+mV9DEThPR4KDw9aBkxSoklUrFyhPl8kTm+ciVNbV1c5YufazP45VA4Xl+kyzIP5RIVffZS8CAL2PBCOlyfxhVlq2NSJosD/FD4FnwP1k1aFnZFrQ4BwkTzN4GDCNMWDtmajdht5BKJXa+HIuBmstXX8FX6+WaOueCbfVtL5emi8K9/AWoy7Aid49+DsOLjphJMhwOG83xL/LiKk0Z2nTwgsgKm1iWFbigBIY+mJ04Sq9AiUX4etxAGDDM0I6BkmQydv9+qhy7fPV9Lbjy9Ls5tGcv5QwW02peVFk+H2HFqgdxlxlxIW6StIEU48DMkUPDrBgRI7xYPJpXBZblnAEuUF8P2YKFhECbcZwgDAaiGjMYkGQyNcuBGfZidW1tHaxxFOagBtMBSSqqebA9eFEc/QxDQ7JcJEBmKR++VnUrP2RNkgX+gsBrqqpybW2BIGw/OGgCRWkcNcAMNuhRjaT6k8l+nXT5z8jCYzUe796Td691nbY70+mixAqsoF3//DGEdEBBYGwy7ja63Uazy0zODMuRiMyKAg8PSdFeH5Se4yyNzbB+0QS9mzbhuAHSBScCKmYxqYlzO0ZWH6hbAI1SK+w1+TnOJ0kqOx3K2WxmsxnZDgwjKHHrouBGx8DrUVHUNJ5VoxCq9MTCWRiv5+nefXYaXYWbCLQdpSFZGzEk2a8HQMqZkSXr59Sfbu2BhmxuU1s6rRYlfjobisdJUk+W2WZ0zzLgAniQtNANTeZ1jKCqrCQxTK/Vus3i8VJ2B1SGWIWaaKqJOoYh/f2ppA6JlV9kqq/sPrz14cMN0LxuqXuZVlmh81suFBqIFxIJN2l069nSC99h7EsUCok+W/i7rMkaz7OzISm+Zq/H6rFSvVt2OVAUCo877NCPIpVkSldSicWen6neX2OYd+7ZxT1dXTcb/c2KT5kecyGusVyoUJqcHI+T40Bxk26y9HuykIj3QWP/ndEUoPBy1BfkOOaJ1ep1dHrs/pM1KEGbCMzRUgOQ1H2oiA4pZ15U41jLOhDS09N6CncGBVX6lkO2u8amwBwqlCDGS+PjpUk4PmWzY2EbtKq5Lxqv/VR4tpNp9vn8bxth/7HBgVOUfxftaMApO/w7NiHA+C+lPDFye8ktbO7muW96jvec9tNOSfFn9aUkQWan/mb7CjC+DiglwPb6NZXN2Yw2s+sbw2qsIgV9fuYkWDi+aOPhww0Y7QCndBtFOQAyvwVJ6iWpVACSybxfsnD1vIeb79zZ03r2LtiXxUEd0mGzkWQWKCQJiEICGFOPBuIJss9s3I5ke6MMo0pSlOH8nRxzzXut64TXjm00DHb6rXaqqWbL8pZ/NZphaGplGMdPDG+te7ZiH9r50qI70G0sWIOmRckm5+C4MoI5R30oBNmHNdTZOZ5jaGC1To6T3uCg7V617HbZaTLciXU25od7RHcLhe5l3qaQQ2s6d2OQi0Z3Ldp6TtHjOaIg78//8zzv874+r0qBBAZcC7t/3lV1B7ZDAT5k9gsT/f3fvkPcKYMSCPiPQ/d+r125fOUzQFy7d2fos2tgl2ywSDbJmffmZ7a+emfr9tQUMTUD57VrsgM89lKPtY/o0KoNhi5ktfFfSH59AEH5RTUXykgbfMEsCjRJWvsPywMD+qH7P94f0p/c+xh+w1+79tnKiV6vhxS22S7Zxi83ua0pwjr1XpJMWsnozNrt22trdCJh0arVcHWo1SMGAyj5/JOboGMBlGzeCn0Q9HkDwSDGyALNWfshuy4B5D7MD/3JncjKFWhCARXe2cBg
u+c65KasVoKYss5zSZKUo9FowmExWDTaPpLknobGmQUgmdWbUFN+/bfg3zDmcr4gbPcKOEtRouX7/tPDMpQR5YLvfhK5//NORGHAO5ABkCvpFmmdmrf209y8laZlwUJR/kQi4XFMcOq+Ptkx1mF4DiAgZfXzm4s3FxYePFyam9v27i6d2XGMlQUrIXNHrsu2yzYwvTIVISR6YCimt11ShBwVyS0iSVjnKfFCFKy04IGqBGnjsWjBptWesTGANFZByueri4q/tndzn875lhrLBSeGyaxfniBP02VlsVL8b0MGoAKDDf1XJMfHxwdACPfCPMmRVgKL1UlhDUoS1AszCt2Tp7XqsY4u6LGNIQ2gNFYbAHmwu5vdngv5MpnseaFQZAWWpeX6IWQrtDtsiDLmlWsDkFKXlbAr3tLrm1aOILa2OEqM4ZilXytQMJHNo6NXjSZNe7t2eqTLAp0DZKPRyABkQVGyu+vzBr2+LECceIxmWVYmxXgE0ngcGI+NlyMfIzblhU0x2OPtXdRFUiTnORErsiynfYGaZBh+tPObn64CBM5O1RbLtOc5JNPIZEANCFl48GDJp5Oyfy5JmQLuxATq4ICiBGczveLSK4Mj5UgZxlYMQfSXkAHXYZGUaY6VRfAVR7NUlGInzev8Rz8EOnWzIEXdMW0YG/MgGcUUyMJNWOezYV/uTy/v40EKVmeEgwNMxOJNCIzehuhdaRekFTBsegSW4HSrpa3TrIjZCy2RThKCwFKC32zmRwOBq4YujVarMYxZxhwO5CyT2cg0FhdXgbO7lPNmf8tJks9dcB5gWpllD/y4vxWP34mcuE4i6fSOawi5bHsb0uDHlXL8AisWRRxP4fbTOs3RMxTFTA6aJSgZXuPLXSOa2ZGOMaLLsYZkzvhMo1pdqsIOL1v1eXO5XCAg8RL4ixVEjLEf4H4RO4xXmvuRyE66XCvfH1gZiKy4SkfOIoYVcHveHcOK1nlGkEXGgK57e71hCVpnGo1GO93heBaigph5PrOxEQhWVxd3fcGsN5sN5SQ+WCg4CwyHsZj/4MBux+tYLKZc8aNKJd/cT1cqlRYGCZVKud2lFi7SXJKmorTADJrNowFUB6cjs0CZ7ph2PDs9oUBGR6VwAJQAwzfny+ay0vl5we8sOEWWZQpOv92d+hvH8BjcGBg8tZxYDMNL7pQ7FcdwcJkVahBBW4r+N9B1KdypM8wCY7YD9iWOiWkHYjYzvLQePqsuLYWDPigqWV/ouH4uOWO8hB3gnosiHnNX3PFUPv83DuNDBPA4IFL5UiqfKrkhMU6tgoWkpmgKvMW/rDMavaYRYLRr1ACZniBACWMeXR/8SKouhtBANRT05uY2T2Pn4RbP8347xtRxrHjgdldK+XQln4+XSuCfPfc+WMTddIMHL4rMqyT7hEwzfhQ1+zvDnZ1GowmUzLZ3EGqOcxAkIpgHzSgqrQer0BIOhEJeb/bY6TwOHRcKBdbPshQr4AVnCYLhSkf2Ins7f+xFajv7e5H9Wjzewi+KxboogrdkGSBAWb8aftloMqnaYZuihpNEYoIgkEl0chDtXJeCErre2RnS+ea2jefHwKryvB3HcZbFIOKxo2bFVXat1FZqOzVXrba/l95rHrqLuHhab6cpUpYTTJFB16/CwYKq22SabWvTal6Bo2r1UxxHIFSUYRhUCgRQs/kNVAoFje8aex8eb3vDGwEct4PhOISgcNS8U0una+VaJO3a29uvNA/jpzFMFE9BCksQFGowoC+HwzrdW8bu7tlu2Dtq2rXchJokAUJ7PDS4S1pnJoVBNByWjqHdva06vsj5AnZ+GSCAKcRKR0fNZtp1L11Jp9N7laP8Yct5Ua/j9WKdltlJVJhEdSYdnIyBEmgwmgAyotWSBEEokGhClsGV0JVmBAaF7zNiNKpUpuG/vAF+Y2N5GTg4GFaKV+IVmCXgtyYw4q14K3YRi4lFkfKzftSJ8uGQoVf1Yu8XqvdVbbPDj0LhUlthmbcmrUi0a/q2JzHICAkLnUAtI5Yuo2rYpHp415STspnl5WU7D1rckKz5eL4E+XUUz+f34TWkmZOJYX7MDlkYCIQ7vaHwrd7uL1Svtb/f1t72aJva8nRPsi/Z82oScTjW1jwO8BkdjXosFlBiUkHnfvPG8LZ3m8+cKRiwFMxteKTy7jxk8X7erUx1N++225cVtRu+oC7kfevWux9+qWprG36zrf2ZHm0HbFi0Pcnkq8jE2gwtOzy36SgV9chEAiDdw93Dqu2Hx5tStZGxL8MwcCscOwys0MAU5vK/drZ8luGrPshKb++tD69/ePf6h23XX+p5RtPToX2pJ9kDN/LCE1tPkU89BRtf5Q9UVrKv5/nHn3n8+ut3b7x53bTpDYfCqGLOf+3ceV5Xnv83NHyuC4fhM97NzVt3byhdzF++/u516JQ+Cn+xeTT56iNPwuORfwDmIxlqcXq9nQAAAABJRU5ErkJggg==";
});
canvas {width: 100px; height: 100px}
div {text-align:center; width:120px; float:left}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div>
<canvas id="canvOrigMinus100" width="100" height="100"></canvas>
-98%
</div>
<div>
<canvas id="canvOrigMinus50" width="100" height="100"></canvas>
-50%
</div>
<div>
<canvas id="canvOrig" width="100" height="100"></canvas>
Original
</div>
<div>
<canvas id="canvOrigPlus50" width="100" height="100"></canvas>
+50%
</div>
<div>
<canvas id="canvOrigPlus100" width="100" height="100"></canvas>
+98%
</div>
<hr/>
<div style="clear:left">
<canvas id="canvRoundMinus90" width="100" height="100"></canvas>
Round-trip <br/> (-98%, +98%)
</div>
<div>
<canvas id="canvRoundMinus50" width="100" height="100"></canvas>
Round-trip <br/> (-50%, +50%)
</div>
<div>
<canvas id="canvRound0" width="100" height="100"></canvas>
Round-trip <br/> (0% 100x)
</div>
<div>
<canvas id="canvRoundPlus50" width="100" height="100"></canvas>
Round-trip <br/> (+50%, -50%)
</div>
<div>
<canvas id="canvRoundPlus90" width="100" height="100"></canvas>
Round-trip <br/> (+98%, -98%)
</div>
Explanation
(Disclaimer - I am not an image specialist or a mathematician. I am trying to provide a common-sense explanation with minimal technical details. Some hand-waving below, e.g. 255=256 to avoid indexing issues, and 127.5=128, for simplifying the numbers.)
Since, for a given pixel, the possible number of non-zero values for a color channel is 255, the "no-contrast", average value of a pixel is 128 (or 127, or 127.5 if you want to argue, but the difference is negligible). For purposes of this explanation, the amount of "contrast" is the distance from the current value to the average value (128). Adjusting the contrast means increasing or decreasing the difference between the current value and the average value.
The problem the algorithm solves then is to:
Choose a constant factor to adjust contrast by
For each color channel of each pixel, scale "contrast" (distance from average) by that constant factor
Or, as hinted at in the CSS spec, simply choosing the slope and intercept of a line:
<feFuncR type="linear" slope="[amount]" intercept="-(0.5 * [amount]) + 0.5"/>
Note the term type='linear'; we are doing linear contrast adjustment in RGB color space, as opposed to a quadratic scaling function, luminance-based adjustment, or histogram matching.
If you recall from geometry class, the formula for a line is y=mx+b. y is the final value we are after, the slope m is the contrast (or factor), x is the initial pixel value, and b is the intercept of the y-axis (x=0), which shifts the line vertically. Recall also that since the y-intercept is not at the origin (0,0), the formula can also be represented as y=m(x-a)+b, where a is the x-offset shifting the line horizontally.
For our purposes, this graph represents the input value (x-axis) and the result (y-axis). We already know that b, the y-intercept (for m=0, no contrast) must be 128 (which we can check against the 0.5 from the spec - 0.5 * the full range of 256 = 128). x is our original value, so all we need is to figure out the slope m and x-offset a.
First, the slope m is "rise over run", or (y2-y1)/(x2-x1) - so we need 2 points known to be on the desired line. Finding these points requires bringing a few things together:
Our function takes the shape of a line-intercept graph
The y-intercept is at b = 128 - regardless of the slope (contrast).
The maximum expected 'y' value is 255, and the minimum is 0
The range of possible 'x' values is 256
A neutral value should always stay neutral: 128 => 128 regardless of slope
A contrast adjustment of 0 should result in no change between input and output; that is, a 1:1 slope.
Taking all these together, we can deduce that regardless of the contrast (slope) applied, our resulting line will be centered at (and pivot around) 128,128. Since our y-intercept is non-zero, the x-intercept is also non-zero; we know the x-range is 256 wide and is centered in the middle, so it must be offset by half of the possible range: 256 / 2 = 128.
So now for y=m(x-a)+b, we know everything except m. Recall two more important points from geometry class:
Lines have the same slope even if their location changes; that is, m stays the same regardless of the values of a and b.
The slope of a line can be found using any 2 points on the line
To simplify the slope discussion, let's move the coordinate origin to the x-intercept (-128) and ignore a and b for a moment. Our original line will now pivot through (0,0), and we know a second point on the line lies at the full range of both x (input) and y (output), at (255,255).
We'll let the new line pivot at (0,0), so we can use that as one of the points on the new line that will follow our final contrast slope m. The second point can be determined by moving the current end at (255,255) by some amount; since we are limited to a single input (contrast) and using a linear function, this second point will be moved equally in the x and y directions on our graph.
The (x,y) coordinates of the 4 possible new points will be 255 +/- contrast. Since increasing or decreasing both x and y would keep us on the original 1:1 line, let's just look at (+x, -y) and (-x, +y).
The steeper line (-x, +y) is associated with a positive contrast adjustment; its (x,y) coordinates are (255 - contrast, 255 + contrast). The coordinates of the shallower line (negative contrast) are found the same way. Notice that the biggest meaningful value of contrast is 255 - the most that the initial point of (255,255) can be translated before resulting in a vertical line (full contrast, all black or white) or a horizontal line (no contrast, all gray).
So now we have the coordinates of two points on our new line - (0,0) and (255 - contrast,255 + contrast). We plug this into the slope equation, and then plug that into the full line equation, using all the parts from before:
y = m(x-a) + b
m = (y2-y1)/(x2-x1) =>
((255 + contrast) - 0)/((255 - contrast) - 0) =>
(255 + contrast)/(255 - contrast)
a = 128
b = 128
y = (255 + contrast)/(255 - contrast) * (x - 128) + 128 QED
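As a quick numeric check of the result, using the raw 0..255 contrast range: with contrast = 51, factor = (255 + 51)/(255 - 51) = 306/204 = 1.5, so an input value of 178 maps to 1.5 * (178 - 128) + 128 = 203, while the neutral value 128 maps to itself - exactly the pivot-around-128 behavior described above.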
The math-minded will notice that the resulting m or factor is a scalar (unitless) value; you can use any range you want for contrast as long as it matches the constant (255) in the factor calculation. For example, with a contrast range of +/-100 the factor becomes (100 + contrast)/(100.01 - contrast), which is what I actually use to eliminate the step of scaling to 255; I just left 255 in the code at the top to simplify the explanation.
Note about the "magic" 259
The source article uses a "magic" 259, although the author admits he doesn't remember why:
"I can’t remember if I had calculated this myself or if I’ve read it in a book or online.".
259 should really be 255 or perhaps 256 - the number of possible non-zero values for each channel of each pixel. Note that in the original factor calculation, the outer 259/255 term is roughly 1.016, and since final values are whole integers it amounts to 1 for all practical purposes, so that term can be discarded. Actually using 255 for the constant in the denominator, though, introduces the possibility of a divide-by-zero error in the formula; adjusting to a slightly larger value (say, 259) avoids this issue without introducing significant error to the results. I chose to use 255.01 instead, as the error is lower and it (hopefully) seems less "magic" to a newcomer.
As far as I can tell though, it doesn't make much difference which you use - you get identical values except for minor, symmetric differences in a narrow band of low contrast values with a low positive contrast increase. I'd be curious to round-trip both versions repeatedly and compare to the original data, but this answer already took way too long. :)
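If you want to eyeball that yourself, here is a throwaway snippet (nothing canvas-specific) that prints the mapped output of both variants side by side for a handful of input values, so you can compare them directly:
var contrast = 30; // sample adjustment on the [-255..255] scale
var fMagic = (259 * (contrast + 255)) / (255 * (259 - contrast)); // source article's factor
var fMine = (255 + contrast) / (255.01 - contrast);               // this answer's factor
for (var x = 0; x <= 255; x += 32) {
    console.log(x,
        Math.round(fMagic * (x - 128) + 128),
        Math.round(fMine * (x - 128) + 128));
}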
After trying the answer by Schahriar SaffarShargh, I found it wasn't behaving like contrast should. I finally came across this algorithm, and it works like a charm!
For additional information about the algorithm, read this article and its comments section.
function contrastImage(imageData, contrast) {
var data = imageData.data;
var factor = (259 * (contrast + 255)) / (255 * (259 - contrast));
for(var i=0;i<data.length;i+=4)
{
data[i] = factor * (data[i] - 128) + 128;
data[i+1] = factor * (data[i+1] - 128) + 128;
data[i+2] = factor * (data[i+2] - 128) + 128;
}
return imageData;
}
Usage:
var newImageData = contrastImage(imageData, 30);
Hopefully this will be a time-saver for someone. Cheers!
This JavaScript implementation complies with the SVG/CSS3 definition of "contrast" (and the following code will render your canvas image identically):
/*contrast filter function*/
//See definition at https://drafts.fxtf.org/filters/#contrastEquivalent
//pixels come from your getImageData() function call on your canvas image
contrast = function(pixels, value){
var d = pixels.data;
var intercept = 255*(-value/2 + 0.5);
for(var i=0;i<d.length;i+=4){
d[i] = d[i]*value + intercept;
d[i+1] = d[i+1]*value + intercept;
d[i+2] = d[i+2]*value + intercept;
//implement clamping in a separate function if using in production
if(d[i] > 255) d[i] = 255;
if(d[i+1] > 255) d[i+1] = 255;
if(d[i+2] > 255) d[i+2] = 255;
if(d[i] < 0) d[i] = 0;
if(d[i+1] < 0) d[i+1] = 0;
if(d[i+2] < 0) d[i+2] = 0;
}
return pixels;
}
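Usage mirrors the CSS filter: a value of 1 leaves the image unchanged, values below 1 reduce contrast, and values above 1 increase it. A minimal sketch (the canvas id is a placeholder):
var ctx = document.getElementById('myCanvas').getContext('2d');
var pixels = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height);
contrast(pixels, 1.5);          // roughly equivalent to filter: contrast(150%)
ctx.putImageData(pixels, 0, 0);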
I found that you have to apply the effect by separating the darks and lights: technically, any pixel whose average (R + G + B) / 3 on the RGB scale is less than 127 counts as a dark, and anything above 127 counts as a light. Then, for your chosen level of contrast, you subtract a value (say 10) from the darks and add the same value to the lights.
Here is an example:
I have two pixels with RGB colors, [105,40,200] | [255,200,150]
So I know that for my first pixel 105 + 40 + 200 = 345, and 345/3 = 115.
115 is less than half of 255 (127), so I consider this pixel closer to [0,0,0]. Therefore, if I want to apply -10 contrast, I take an amount away from each channel in proportion to the pixel's average:
I divide each channel's value by the pixel's average (115 in this case), multiply that by my contrast value, and subtract the result from that channel.
For example, take the red channel (105): divide it by the average (115) and multiply by my contrast value of 10 - (105/115)*10 gives roughly 9 (rounded) - then take that 9 away from 105, so red becomes 96 after applying 10 contrast to a dark pixel.
Continuing like this, my pixel's values become [96,37,183]! (Note: the scale of contrast is up to you, but in the end you should convert it to some scale like 1 to 255.)
For the lighter pixels I do the same, except instead of subtracting the contrast value I add it, and if you reach the limit of 255 or 0 you stop adding or subtracting for that specific channel. So my second, lighter pixel becomes [255,210,157].
As you add more contrast it will lighten the lighter colors and darken the darker and therefore adds contrast to your picture!
Here is some sample JavaScript code (I haven't tried it yet):
var data = imageData.data;
var contrast = 10; // contrast amount; the scale is up to you
for (var i = 0; i < data.length; i += 4) {
    var average = Math.round( ( data[i] + data[i+1] + data[i+2] ) / 3 );
    if (average === 0) continue; // skip pure black pixels to avoid dividing by zero
    if (average > 127){
        // lighter pixel: push each channel up in proportion to its share of the average
        data[i] += ( data[i]/average ) * contrast;
        data[i+1] += ( data[i+1]/average ) * contrast;
        data[i+2] += ( data[i+2]/average ) * contrast;
    }else{
        // darker pixel: push each channel down in proportion to its share of the average
        data[i] -= ( data[i]/average ) * contrast;
        data[i+1] -= ( data[i+1]/average ) * contrast;
        data[i+2] -= ( data[i+2]/average ) * contrast;
    }
}
You can take a look at the OpenCV docs to see how you could accomplish this: Brightness and contrast adjustments.
Then there's the demo code:
double alpha; // Simple contrast control: value [1.0-3.0]
int beta; // Simple brightness control: value [0-100]
for( int y = 0; y < image.rows; y++ )
{
for( int x = 0; x < image.cols; x++ )
{
for( int c = 0; c < 3; c++ )
{
new_image.at<Vec3b>(y,x)[c] = saturate_cast<uchar>( alpha*( image.at<Vec3b>(y,x)[c] ) + beta );
}
}
}
which I imagine you are capable of translating to javascript.
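For reference, a hedged sketch of what that translation might look like on canvas ImageData (alpha and beta keep the same meaning as in the OpenCV snippet; the saturate_cast clamping comes for free from the Uint8ClampedArray):
function brightnessContrast(imageData, alpha, beta) { // alpha: contrast [1.0-3.0], beta: brightness [0-100]
    var data = imageData.data;
    for (var i = 0; i < data.length; i += 4) {
        data[i]     = alpha * data[i]     + beta; // red
        data[i + 1] = alpha * data[i + 1] + beta; // green
        data[i + 2] = alpha * data[i + 2] + beta; // blue
    }
    return imageData;
}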
By vintaging I assume you're trying to apply LUTs. Recently I have been trying to add color treatments to canvas windows. If you want to actually apply LUTs to the canvas window, I believe you need to map the array that getImageData() returns onto the RGB array of the LUT.
(From Light illusion)
As an example the start of a 1D LUT could look something like this:
Note: strictly speaking this is 3x 1D LUTs, as each colour (R,G,B) is a 1D LUT
R, G, B
3, 0, 0
5, 2, 1
7, 5, 3
9, 9, 9
Which means that:
For an input value of 0 for R, G, and B, the output is R=3, G=0, B=0
For an input value of 1 for R, G, and B, the output is R=5, G=2, B=1
For an input value of 2 for R, G, and B, the output is R=7, G=5, B=3
For an input value of 3 for R, G, and B, the output is R=9, G=9, B=9
Which is a weird LUT, but you see that for a given value of R, G, or B input, there is a given value of R, G, and B output.
So, if a pixel had an input value of 3, 1, 0 for RGB, the output pixel would be 9, 2, 0.
During this I also realized, after playing with imageData, that getImageData() returns a Uint8ClampedArray and that the values in that array are decimal. Most 3D LUTs are hex, so you first have to do some type of hex-to-dec conversion on the entire LUT before all this mapping.
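As a rough illustration of the mapping step only (it assumes you have already parsed the LUT into three plain 256-entry arrays - lutR, lutG, lutB are made-up names here):
function applyLut(imageData, lutR, lutG, lutB) {
    var data = imageData.data;
    for (var i = 0; i < data.length; i += 4) {
        data[i]     = lutR[data[i]];     // each channel's value indexes its own 1D LUT
        data[i + 1] = lutG[data[i + 1]];
        data[i + 2] = lutB[data[i + 2]];
    }
    return imageData;
}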
This is the formula you are looking for ...
var data = imageData.data;
if (contrast > 0) {
for(var i = 0; i < data.length; i += 4) {
data[i] += (255 - data[i]) * contrast / 255; // red
data[i + 1] += (255 - data[i + 1]) * contrast / 255; // green
data[i + 2] += (255 - data[i + 2]) * contrast / 255; // blue
}
} else if (contrast < 0) {
for (var i = 0; i < data.length; i += 4) {
data[i] += data[i] * (contrast) / 255; // red
data[i + 1] += data[i + 1] * (contrast) / 255; // green
data[i + 2] += data[i + 2] * (contrast) / 255; // blue
}
}
Hope it helps!
I'm trying to replace a color and colors near it in a bitmap.
threshold() seems to work, but it appears that you have to specify either the exact color ("==") or all colors before or after the exact color ("<", ">", "<=", ">="). I am hoping that the mask parameter will help me find a way to match a color plus a dynamic range of colors around it to be replaced. What is its intended usage?
Per the comment below, examples 1 and 2:
bit.threshold(bit, bit.rect, point, ">", 0xff000000, 0xffff0000, 0x00FF0000);
bit.threshold(bit, bit.rect, point, ">", 0xff000000, 0xffff0000, 0x00EE0000);
If you're trying to do a flood fill, I don't think the mask parameter will help you. The mask parameter lets you ignore parts of the color in the test. In your case, you want to take into account all the channels of the color, you just want the matching to be fuzzy.
e.g. If you want to replace all pixels where the red component is 0, you can set mask to 0x00FF0000, so it will ignore the other channels.
The implementation pseudo-code probably looks something like this:
input = readPixel()
value = input & mask
if(value operation threshold)
{
writePixel(color)
}
Neither of your samples will produce anything because the mask limits the values to be between 0x00000000 and 0x00FF0000, then tests if they're greater than 0xFF000000.
I have also done this, and eventually I found it best to create my own threshold method. You can find it below; everything is explained in the comments.
//_snapshot is a bitmapData-object
for(var i:int = 0; i < _snapshot.width; i++)
{
for(var j:int = 0; j < _snapshot.height; j++)
{
//We get the color of the current pixel.
var _color:uint = _snapshot.getPixel(i, j);
//If the color of the selected pixel is between certain values set by the user,
//set the filtered pixel data to green.
//Threshold is a number (can be quite high, up to 50000) to look for adjacent colors in the colorspace.
//_colorToCompare is the color you want to look for.
if((_colorToCompare - (100 * _threshold)) <= _color && _color <= (_colorToCompare + (100 * _threshold)))
{
//This sets the pixel value.
_snapshot.setPixel(i, j, 0x00ff00);
}
else
{
//If the pixel color is not within the desired range, set its value to black.
_snapshot.setPixel(i, j, 0x000000);
}
}
}