
This Simple App Lets You See How Hollywood Uses Color to Mess With Your Emotions

by Nailya Safarova · 8 min read · 2025/04/14

Too Long; Didn't Read

Learn how color grading works, what color targets are, and build your own palette-based grading tool in React — from scratch and with purpose.


Why does Joker feel so unsettlingly green and yellow?

Why is Blade Runner soaked in teal and orange?

Why does film make skin look soft and warm?


It’s not magic. It’s color.


And whoever controls color — controls the emotional weight of the frame. Color grading is cinema’s visual language — a way to shape atmosphere, guide the viewer’s attention, and set the tone of a story. It can be subtle and almost invisible. Or bold and stylized, like in the work of Wong Kar-Wai, Fincher, Nolan, or Villeneuve. But to speak this language fluently, you first need to understand its grammar.


Where does color work actually begin? How do we know what the “right” color even is? And how do you translate creative intent into technical action?


Spoiler: it doesn’t start with LUTs or filters. It starts with color targets, calibration, and carefully chosen palettes.



In this series, we’ll explore:

  • what color grading is on the most fundamental level;
  • why limiting your palette is a feature, not a bug;
  • how color charts work (and why they matter);
  • and how to build your own grading tool in the browser — with React and some simple math.

We won’t just explain it. We’ll show it. And you’ll be able to experiment, tinker, and maybe for the first time actually see what color grading is made of.

What is Color Grading, and Why Start with Color Targets?

    Color grading is the process of adjusting and stylizing an image to achieve a specific visual tone, emotional effect, or technical standard. Whether in film, photography, or digital production, grading usually includes:


  • balancing exposure and white point;
  • correcting technical color shifts;
  • crafting a visual style with contrast, tones, and color curves.

But grading isn’t just about “making it pretty.” It’s about control — of the mood, of the viewer’s focus, of the visual language of the entire narrative.

X-Rite ColorChecker Classic


    Before you apply any creative look, you need to bring the image to a neutral state, which means:



  • correcting unwanted color casts (color correction);
  • normalizing the footage to a standard color space (like Rec.709 or sRGB);
  • matching material from different cameras into a consistent baseline.


And for that, colorists use color targets like the ColorChecker, ChromaDuMonde, or other reference charts.


    A color target is a chart of color patches with precisely measured values. These values aren’t arbitrary — they’re obtained through spectrophotometric measurements in controlled environments using professional equipment like X-Rite or Konica Minolta spectrophotometers.



    One of the first widely adopted targets was the Kodak Gray Scale — a strip of neutral gray tones used for exposure control. Later came more advanced charts with full-color patches — like the Macbeth ColorChecker, introduced in 1976 (now known as the X-Rite ColorChecker). It features 24 color swatches designed to represent common real-world colors: human skin, blue sky, green foliage, and more.



    With the rise of digital photography and digital cinema, color targets became even more critical. They are now essential tools for calibrating not just cameras, but also monitors, printers, scanners — and any device that handles color. They’re used in color matching, profiling, and neutral balancing workflows — from film production to scientific imaging.


The process of calibrating with color targets


    Take X-Rite’s ColorChecker, for example. Each patch is measured under standardized lighting (usually D65 or D50), with results recorded in CIE XYZ coordinates — a device-independent color model. Those coordinates are then converted into RGB values, depending on your working color space (like sRGB, Rec.709, or AdobeRGB).

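To make that conversion concrete, here is a minimal sketch, in the same plain JavaScript used later in the article, of how a measured CIE XYZ value (D65, normalized so that white has Y = 1) can be turned into an 8-bit sRGB triple. The function name xyzToSRGB is our own illustration; the matrix and transfer curve are the standard sRGB ones.

// CIE XYZ (D65) → 8-bit sRGB, using the standard sRGB matrix and gamma curve
const xyzToSRGB = ([x, y, z]) => {
  // 1. Linear RGB from XYZ
  const rLin =  3.2406 * x - 1.5372 * y - 0.4986 * z;
  const gLin = -0.9689 * x + 1.8758 * y + 0.0415 * z;
  const bLin =  0.0557 * x - 0.2040 * y + 1.0570 * z;

  // 2. Apply the sRGB transfer curve and clamp to the displayable range
  const encode = (c) => {
    const clamped = Math.min(Math.max(c, 0), 1);
    const v = clamped <= 0.0031308
      ? 12.92 * clamped
      : 1.055 * Math.pow(clamped, 1 / 2.4) - 0.055;
    return Math.round(v * 255);
  };

  return [encode(rLin), encode(gLin), encode(bLin)];
};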


    So the RGB arrays we use in our app aren’t guesswork — they’re precise digital representations of standardized, physically measured patches.


    If the skin tone patch in the ColorChecker Classic is defined as [194, 150, 130] in RGB, that’s how it should look under correct conditions. If your footage shows something different, that’s a sign of a color cast — and a starting point for correction.
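
As a rough illustration of how that comparison can drive a first correction (the helper below is our own sketch, not part of CinePalette, and a real grade would do this in linear light rather than on gamma-encoded values), you can derive per-channel gains that pull the measured patch back to its reference:

// Published sRGB reference for the ColorChecker Classic "light skin" patch
const REFERENCE_SKIN = [194, 150, 130];

// Average RGB actually measured on that patch in the footage → per-channel gains
const castCorrectionGains = (measured, reference = REFERENCE_SKIN) =>
  reference.map((ref, i) => ref / measured[i]);

// Example: footage where the patch reads slightly green
const gains = castCorrectionGains([180, 158, 122]); // ≈ [1.08, 0.95, 1.07]

// Applying the same gains to every pixel nudges the whole image toward neutral
const applyGains = ([r, g, b], [gr, gg, gb]) => [
  Math.min(255, Math.round(r * gr)),
  Math.min(255, Math.round(g * gg)),
  Math.min(255, Math.round(b * gb)),
];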


The Catch: Color Charts Are Just the Beginning

    Color targets are essential for calibration — but that’s all they are. A beginning. They don’t account for:

  • how colors behave in highlights or shadows;
  • the unique characteristics of film stock or lenses;
  • or the creative intent behind a particular look.

In professional tools like DaVinci Resolve or Dehancer, color charts are just step one in a long pipeline. From there, you move into advanced processes like film emulation, tone mapping, grain, halation, bloom, and other stylistic transformations. So it’s critical to understand: a chart is a calibration tool — not a style.

CinePalette: A Minimal Tool for Exploring Color Palettes

    To show how choosing a palette affects an image, we built CinePalette — a simple web app that visualizes what happens when you restrict your color space (a process known as palette reduction).



What You Can Do with CinePalette:

  • upload any image;
  • pick a palette (ColorChecker, Portra, Sepia, etc.);
  • remap every pixel to the closest color in that palette;
  • compare before & after with an interactive slider;
  • save the result;
  • or build your own palette from scratch.

How It Works in Code

Main Menu

    Our app runs entirely in the browser using React and the Canvas API. The project — called CinePalette — will be open-sourced and available on GitHub (link at the end of the series).



    We start with a set of predefined palettes, but users can also build and save their own. Palettes are defined as arrays of RGB values — for example, here’s what the Kodak Portra 400 palette looks like:

    "Portra 400": [
      [75, 60, 50],     // shadows
      [160, 130, 110],  // skin tones
      [220, 200, 180],  // highlights
      [60, 100, 80],    // foliage
      [180, 150, 100]   // neutral
    ],
    
    "Portra 400": [ [75, 60, 50], // shadows [160, 130, 110], // skin tones [220, 200, 180], // highlights [60, 100, 80], // foliage [180, 150, 100] // neutral ],


    The selected palette defines which colors are “allowed” to appear in the final image. These become the visual language of the frame — the base tones that set its mood and style.


    When a user uploads an image and chooses a palette, here’s what happens under the hood:

1. The image is rendered to a hidden <canvas> — this gives us pixel-level access to manipulate the data.
2. We extract the ImageData object, which contains an array where each pixel is represented by four values: [R, G, B, A]. (A sketch of these first two steps follows this list.)
3. We loop through every pixel and extract its RGB color.
4. For each pixel, we find the closest matching color from the selected palette, using Euclidean distance in RGB space — and replace it.
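
Steps 1 and 2 aren’t shown in the snippets below, so here is a minimal sketch of how they might look in plain JavaScript. The helper name loadImageData and its promise-based shape are our own illustration, not CinePalette’s actual code:

// Draw an uploaded image onto an off-screen canvas and return its pixel data.
const loadImageData = (imageUrl) =>
  new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => {
      const canvas = document.createElement('canvas'); // hidden: never attached to the DOM
      canvas.width = img.width;
      canvas.height = img.height;
      const ctx = canvas.getContext('2d');
      ctx.drawImage(img, 0, 0);
      // imageData.data is a flat Uint8ClampedArray: [R, G, B, A, R, G, B, A, ...]
      resolve({ canvas, ctx, imageData: ctx.getImageData(0, 0, canvas.width, canvas.height) });
    };
    img.onerror = reject;
    img.src = imageUrl; // e.g. an object URL created from the user's file input
  });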


    Let’s load up a Shirley card and try applying different palettes — you’ll see immediately how the palette choice shapes the image.

We applied Teal & Orange, Sepia, and ColorChecker Classic palettes to a classic Shirley card


    The core of the magic lies in a function that analyzes each individual pixel and finds the closest matching color from the selected palette:

    const findClosestColor = (r, g, b) => {
      let minDist = Infinity;
      let closest = [r, g, b];
      for (let [pr, pg, pb] of palette) {
        const dist = Math.sqrt((r - pr) ** 2 + (g - pg) ** 2 + (b - pb) ** 2);
        if (dist < minDist) {
          minDist = dist;
          closest = [pr, pg, pb];
        }
      }
      return closest;
    };
    


    Then, we replace the pixel’s original color in the ImageData with the closest match from the palette. And we repeat this — for every single pixel in the image.

    for (let i = 0; i < data.length; i += 4) {
      const [r, g, b] = [data[i], data[i + 1], data[i + 2]];
      const [nr, ng, nb] = findClosestColor(r, g, b);
      data[i] = nr;
      data[i + 1] = ng;
      data[i + 2] = nb;
    }
    


    Once all pixels have been processed, we render the result back onto the <canvas> and convert it to an image using .toDataURL(). This allows the user to see the result instantly in the browser — and download the filtered image with a single click.



    ctx.putImageData(imageData, 0, 0);
    setFilteredImage(canvas.toDataURL());
    
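The “single click” download mentioned above is not shown in the article’s snippets; a common way to wire it up in the browser (this small helper is our own sketch, not confirmed CinePalette code) is a temporary anchor element pointing at the data URL:

// Trigger a browser download of the processed image
const downloadImage = (dataUrl, filename = 'cinepalette.png') => {
  const link = document.createElement('a');
  link.href = dataUrl;      // the string returned by canvas.toDataURL()
  link.download = filename; // suggested file name in the save dialog
  link.click();
};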


    Here, we use Euclidean distance in RGB space — a classic method to measure how “close” two colors are:

    const dist = Math.sqrt((r - pr) ** 2 + (g - pg) ** 2 + (b - pb) ** 2);
    


    Here, (r, g, b) is the color of the current pixel, and (pr, pg, pb) is one of the colors in the palette. Out of all the distances calculated, we choose the smallest one — the closest visual match within the selected palette.



    This approach is intuitive and easy to implement, but it has limitations: RGB space doesn’t account for how humans actually perceive color — for instance, we’re more sensitive to green than to blue, and brightness differences can be misleading.
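
One inexpensive way to partially account for that without leaving RGB is a weighted distance such as the widely used “redmean” approximation. CinePalette currently uses the plain Euclidean version above; the sketch below is only an illustration of the alternative:

// "Redmean" approximation: weights channel differences so that green counts
// most and blue least, roughly following human sensitivity
const perceptualDist = (r, g, b, pr, pg, pb) => {
  const rMean = (r + pr) / 2;
  const dr = r - pr;
  const dg = g - pg;
  const db = b - pb;
  return Math.sqrt(
    (2 + rMean / 256) * dr * dr +
    4 * dg * dg +
    (2 + (255 - rMean) / 256) * db * db
  );
};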


    B&W palette

    B&W palette

    We use this approach in CinePalette as a simple and accessible way to demonstrate the basic principle of color mapping. However, even in its current form, you might notice that some colors get replaced in ways that feel unexpected or “off.”



    In future versions, we plan to add a toggle between RGB and CIELAB color spaces — allowing users to compare how different models affect the accuracy of color matching.

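For reference, converting an sRGB pixel to CIELAB takes three steps: undo the gamma, go through CIE XYZ, then apply the LAB transform. The sketch below uses the standard D65 constants and shows what such a toggle might compute; it is an illustration, not CinePalette’s implementation:

// sRGB (0–255) → CIELAB, assuming a D65 white point
const rgbToLab = (r, g, b) => {
  // 1. Undo the sRGB gamma to get linear light
  const lin = (c) => {
    const v = c / 255;
    return v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
  };
  const [lr, lg, lb] = [lin(r), lin(g), lin(b)];

  // 2. Linear RGB → CIE XYZ, normalized by the D65 white point
  const x = (0.4124 * lr + 0.3576 * lg + 0.1805 * lb) / 0.95047;
  const y = (0.2126 * lr + 0.7152 * lg + 0.0722 * lb) / 1.0;
  const z = (0.0193 * lr + 0.1192 * lg + 0.9505 * lb) / 1.08883;

  // 3. XYZ → LAB
  const f = (t) => (t > 0.008856 ? Math.cbrt(t) : 7.787 * t + 16 / 116);
  return [116 * f(y) - 16, 500 * (f(x) - f(y)), 200 * (f(y) - f(z))];
};

// Euclidean distance in LAB is already much closer to perceived difference
const labDist = (lab1, lab2) =>
  Math.sqrt(lab1.reduce((sum, v, i) => sum + (v - lab2[i]) ** 2, 0));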

Why Does This Matter?

CinePalette showcases a basic but fundamental step in color grading: palette restriction. This is where every visual style begins — with the question: “What if we only used these colors?”


    A Portra palette brings warm, nostalgic tones. Pro 400 feels cool and subdued. Teal & Orange delivers high-contrast cinematic punch. Unlike tools like Dehancer or Resolve, CinePalette doesn’t simulate the physics of film. But it captures the essence: color is a tool for style and storytelling.



Application interface

What’s Next?

    This is just the beginning. In the next parts of the series:

  • we’ll expand CinePalette with the ability to pick a palette from a reference image;
  • add automatic extraction of color schemes from any frame or photo;
  • introduce a toggle between RGB and LAB for more perceptually accurate matching;
  • and break down how color harmony works — and how you can use it in real-world grading.


Stay tuned — and get ready to not just learn color, but truly see it.
