A long time ago, before most of us were born, the science of measuring light had become quite advanced: sensitive instruments and clever procedures could very precisely distinguish one wavelength of light from another and could measure all the wavelengths in a spectrum. Knowledge of the eye had advanced too, and clever, precise methods relying on both instruments and people's ability to judge what they were seeing led to quite good descriptions of how the retina, via the rods and cones, turns a certain amount of light into a certain amount of retinal activity.
These retinal response curves led to the mathematical RGB tristimulus model we are all familiar with, each of the red, green, and blue curves corresponding very well to the response of the retina to light. Unsurprisingly, those curves also correspond very well to the distribution of light in the environment, and there are good quantum-mechanical reasons for that: hydrogen glows red, oxygen blue, nitrogen leans toward green. Our eyes are made from atoms after all, so this all makes some kind of sense.
Now, in order for groups of scientists to work together to build up that model and derive accurate curves, they needed a yardstick for light: some way to know that if they shone a light on something and looked at it in their experiments, they could all be sure they were looking at the same thing. So they came up with a reference light that anyone could construct themselves, along with chemical recipes, mixed in bottles, that could filter the light to get a particular distribution of wavelengths. The formula gave them something similar to clear daylight near their laboratory, wherever that was, at some time of day, maybe lunch time.
A long series of experiments followed, involving completely sober male grad students in Cambridge and no hijinks whatsoever, wherein the students were presented with a test color, typically a narrowband spectral light source, and asked to match it by adjusting the intensities of three fixed primary lights (red, green, and blue). If the primaries alone could not produce a match, one of them was added to the test field instead, effectively requiring a negative amount of that color in the mixture. This technique led to the definition of the standard RGB color matching functions, which describe how different wavelengths of light stimulate the three types of cone cells in the human eye. There are more refined variants from later studies that included people from other regions, and also ladies, and those curves are very, very similar, typically with a bit more sensitivity to blue, though not enough to really upset the mathematics, which is why we end up sticking with the old curves.
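To make the matching functions concrete, here is a minimal sketch of how tabulated curves turn a measured spectrum into three tristimulus values. The five-sample tables below are toy placeholders, not the real CIE data, and the function name is mine; the point is just the shape of the computation, a weighted sum per curve.

```python
import numpy as np

# Hypothetical, coarsely sampled matching functions (NOT the real CIE tables):
# one sensitivity value per wavelength sample for each of the three curves.
wavelengths_nm = np.array([450.0, 500.0, 550.0, 600.0, 650.0])
cmf = np.array([
    [0.33, 0.00, 0.43, 1.06, 0.28],   # x-bar-like curve
    [0.04, 0.32, 0.99, 0.63, 0.11],   # y-bar-like curve
    [1.77, 0.27, 0.01, 0.00, 0.00],   # z-bar-like curve
])

def tristimulus(spectrum: np.ndarray) -> np.ndarray:
    """Integrate a sampled spectral power distribution against the
    matching functions: a weighted sum per curve gives X, Y, Z."""
    return cmf @ spectrum

# A toy spectrum that is strongest in the middle wavelengths.
spd = np.array([0.2, 0.6, 1.0, 0.7, 0.3])
X, Y, Z = tristimulus(spd)
print(X, Y, Z)
```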
Normalizing the tristimulus values projects this three dimensional color space onto a two dimensional chromaticity plane where hue and saturation are represented, but not luminance (the projection is just a normalization, sketched in code after the list of refinements below). This equi-illuminant slice fits neatly inside a unit triangle, with blue down near the origin, green up near 1.0 at the top of the y axis, and red way over near 1.0 on the x axis. The pure spectral colors from red to green climb up the diagonal of that unit triangle, and the colors from green to blue slide down the y axis as closely as they can. This space is mathematically very convenient, but it doesn't have a perceptual metric. A decade later, MacAdam performed a new series of experiments where participants adjusted test colors until they were “just noticeably different” from a reference. He plotted the results as MacAdam ellipses, showing that perceptual color differences were not uniform in the CIE (x,y) space: the ellipses are largest in the greens and smallest in the blues, so the same distance on the chart can be a glaring difference in one region and invisible in another. This led to later refinements, including:
- CIE 1960 introduced the UCS (Uniform Chromaticity Scale), an attempt to improve perceptual uniformity.
- CIE 1976 introduced the (u', v') space, which is still used in HDR and gamut mapping, and also CIELAB (L*, a*, b*), which aims to improve perceptual uniformity via a nonlinear transform.
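Both of those chromaticity mappings are small formulas, so here is a minimal sketch of them in code; the function names are mine, and the math follows the standard definitions of (x, y) and (u', v').

```python
def xy_chromaticity(X: float, Y: float, Z: float) -> tuple[float, float]:
    """CIE 1931 chromaticity: normalize away overall magnitude, keeping
    only the 'direction' of the color. Luminance (Y) is discarded."""
    s = X + Y + Z
    return X / s, Y / s

def uv_prime(X: float, Y: float, Z: float) -> tuple[float, float]:
    """CIE 1976 u'v' chromaticity: a projective remap of (x, y) that
    spaces perceptual differences more evenly than the 1931 diagram."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d
```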
Today we are looking at color science from the perspective of computer graphics, which brings us to the first thing you will encounter: people are always talking about Rec709, the 709th Recommendation of the International Telecommunication Union Radiocommunication Sector. Rec709 is a digital video standard describing how values, typically coming from analog sensors in those days, could be precisely encoded for use in production workflows. It defines color primaries and encoding parameters, and it also specifies an opto-electrical transfer function (OETF), which describes how light is transformed into an electrical signal. We don't use the OETF in our calculations unless we are working with values that were recorded from a camera.
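For reference, the Rec709 OETF itself is a small piecewise curve: linear near black, then a 0.45 power law. A sketch (the function name is mine):

```python
def rec709_oetf(scene_linear: float) -> float:
    """Rec709 opto-electrical transfer function: maps normalized scene
    light in [0, 1] to a non-linear signal value in [0, 1]."""
    if scene_linear < 0.018:
        return 4.5 * scene_linear                     # linear segment near black
    return 1.099 * scene_linear ** 0.45 - 0.099       # power-law segment
```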
When you look at a CIE chromaticity chart with various color gamuts plotted, the smallest triangle is Rec709. The points of the triangle correspond to the measured coordinates of the red, green, and blue sensitivity peaks typical of cameras of the time. Since these coordinates are anchored to the equal-value white point, you can see immediately how the triangle gives you a change of basis between color spaces: you can get from one color space to another by constructing a matrix that maps values expressed against one set of primaries onto another, and then you can do math on them.
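As a concrete illustration of that change of basis, here is a sketch of the usual construction: given the chromaticities of the three primaries and a white point, you can derive the matrix from that RGB space to CIEXYZ, and invert it to come back. The function name is mine; the primaries and white point are the published Rec709 and D65 chromaticities.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Build the 3x3 matrix taking linear RGB (in the given primaries)
    to CIE XYZ, normalized so that RGB = (1, 1, 1) lands on the white point.
    `primaries_xy` is [(xr, yr), (xg, yg), (xb, yb)]; `white_xy` is (xw, yw)."""
    def xy_to_xyz(x, y):
        # Recover a full XYZ direction from a chromaticity, with Y = 1.
        return np.array([x / y, 1.0, (1.0 - x - y) / y])

    # Columns are the (unscaled) XYZ of the red, green, and blue primaries.
    m = np.stack([xy_to_xyz(x, y) for x, y in primaries_xy], axis=1)
    white = xy_to_xyz(*white_xy)

    # Scale each column so the three primaries sum to the white point.
    scale = np.linalg.solve(m, white)
    return m * scale

# Rec709 primaries and the D65 white point.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
D65 = (0.3127, 0.3290)

M_709_to_xyz = rgb_to_xyz_matrix(REC709, D65)
M_xyz_to_709 = np.linalg.inv(M_709_to_xyz)
```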
That triangle is quite small, but it was achievable by manufacturers, and with standard calibrations we could aim for consistent color as video was processed throughout production workflows.
We don't actually ever look at things producing Rec709 output, though; Rec709 is all about turning light waves into numbers. That's where Rec1886 comes in. It's the 1886th recommendation of the International Telecommunication Union Radiocommunication Sector. It's often referred to casually as BT1886, because Rec1886 is all about Broadcast Television; similarly, the digital encoding for broadcast television is casually referred to as BT709. For those unfamiliar, the word television is a compound of the Greek "tele" (τῆλε), meaning "far", and the Latin "visio", meaning "sight" or "seeing". When you hear people talking about a television, they mean a device found in many homes that could receive modulated electromagnetic waves through the air and convert them to light waves emitted from the surface of a screen via a great variety of exceedingly clever mechanical and electrical contrivances.
So, back to BT1886. Rec709 specified the OETF, which describes how a camera converts a scene's light into digital signal values; BT1886 specifies the matching EOTF, which describes how to turn those values back into light on a display. BT1886 was designed to mimic the behavior of old CRT displays, which had a power-law response to the applied signal, the so-called gamma curve. The vagaries of manufacture and the limitations of physics made that an exponent of 2.4. As it is difficult to examine a CRT outside of a museum, or perhaps my garage, a brief explanation is in order. Cathode Ray Tubes were a wonderful contraption that combined exotic chemistry, beautiful blown glass, delicate wire sculpture, and a deadly electron gun that shot electrons accelerated through tens of thousands of volts directly at the viewer. Luckily the glass plate and phosphors at the end of the tube intercepted that energy and converted it to light before it could burn or ionize an audience!
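The BT1886 EOTF is compact enough to show. A sketch, assuming the customary 100 cd/m² reference white and a zero black level as defaults; the parameter names are mine:

```python
def bt1886_eotf(v: float, l_white: float = 100.0, l_black: float = 0.0) -> float:
    """BT1886 EOTF: map a non-linear signal value V in [0, 1] to display
    luminance in cd/m^2, emulating a gamma-2.4 CRT between the display's
    black and white luminances."""
    gamma = 2.4
    lw = l_white ** (1.0 / gamma)
    lb = l_black ** (1.0 / gamma)
    a = (lw - lb) ** gamma                    # overall gain
    b = lb / (lw - lb) if lw != lb else 0.0   # black-level lift
    return a * max(v + b, 0.0) ** gamma
```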
Color tubes were standardized in the mid nineteen fifties. The NTSC standards still in play today date to 1953, and wild electrical considerations, such as a divide by 1.001 on frequencies, are still lovingly emulated by practitioners as a nod to the wonder the world felt when broadcast television moved from monochrome to color.
BT1886 enshrined this vivid nostalgia by describing how to convert a digitally encoded color signal for display on a beautifully emulated ideal television from 1953, because, as is well known, that was in some regard perfect.
Enter sRGB. The "s" stands for "standard", by the way. Consumer-grade CRT monitors in the modern era had a 2.2 gamma curve, although consumers certainly meddled with that using the various twiddly controls connected to the monitor's analog electronics. sRGB specifies both an OETF and an EOTF, and it is focused on the fact that computer frame buffers typically store RGB values in eight bits per component. The 2.2 gamma curve has the nice property that it matches what a monitor does with an electrical signal, and it also redistributes code values so that more bits are dedicated to the darker tones where we perceive differences most readily, so it's, in a sense, a win-win. In graphics APIs, sRGB values are converted to linear for computation, then converted back to sRGB for storage in buffers.
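That round trip is worth seeing, because the sRGB curve is not a pure 2.2 power function but a power curve with a small linear toe near black. A sketch of the two directions (the function names are mine):

```python
def srgb_decode(v: float) -> float:
    """sRGB-encoded value in [0, 1] -> linear light, for doing math."""
    if v <= 0.04045:
        return v / 12.92                      # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(linear: float) -> float:
    """Linear light in [0, 1] -> sRGB-encoded value, for 8-bit storage."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055
```

In a typical pipeline, the decode is applied when reading stored values, shading happens in linear light, and the encode is applied when writing back to an eight-bit buffer.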
What ties all of this together is the shared color primaries, and that's why you often hear Rec709, BT1886, and sRGB used somewhat interchangeably. The thing to keep in mind is that Rec709 reflects the characteristics of analog camera technology evolved from 1950s-era color video cameras, BT1886 is all about emulating classic cathode ray tube technology, and sRGB is all about accommodating the characteristics of twentieth-century CRT computer monitors.
In the modern era there are a great many color spaces, for applications that far exceed the capabilities of vintage analog electronics; these include Rec2020, ACES, and many more. The beauty of the neutral CIEXYZ-1931 space is that it lets us move colors exactly, including colors from historic sources, into any modern gamut we choose.
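As a closing sketch, continuing the earlier example (it reuses rgb_to_xyz_matrix, REC709, D65, M_709_to_xyz, and numpy from that snippet rather than standing alone): to carry a linear Rec709 color into Rec2020, go up to XYZ with one matrix and come back down with the inverse of the other. The Rec2020 primaries below are the published values; the color itself is arbitrary.

```python
# Rec2020 primaries, sharing the same D65 white point as Rec709.
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

M_2020_to_xyz = rgb_to_xyz_matrix(REC2020, D65)

# Compose: linear Rec709 -> XYZ -> linear Rec2020.
M_709_to_2020 = np.linalg.inv(M_2020_to_xyz) @ M_709_to_xyz

rec709_linear = np.array([0.25, 0.5, 0.75])   # an arbitrary linear RGB color
rec2020_linear = M_709_to_2020 @ rec709_linear
```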