Inclusive Smartphone Cameras: The Line Between Innovation and Reputation
Google and Apple have made smartphone cameras that embrace diversity, but are their intentions as pure as they seem?
When you buy a new phone, a lot of thoughts enter your mind. Is the screen big enough, or is it too big? You may consider features like battery life or processor speed. Many people also weigh the camera heavily in a potential smartphone purchase. As a computer in our pocket, the smartphone has rendered many other gadgets obsolete. One of those devices is the point-and-shoot camera. Our phones have become our primary tools for taking photos and shooting video.
Perhaps this evolution in consumer behavior is why camera improvements are the focal point of every new smartphone announcement, regardless of manufacturer. Capturing life’s moments has become the stock-in-trade of the modern smartphone. And despite how far mobile photography has come, there is much more to be accomplished. Smartphone cameras have been chronically poor at capturing subjects with darker skin tones, producing inaccurate results and washed-out facial features. The problem has been rampant yet underemphasized. With the advent of AI and machine-learning photography, Google and Apple are attempting to change this reality.
Redefining the Antiquated Process
Photography and photo science are equal parts simple and complicated. On the surface, the idea seems simple: line up the shot and take a photo. But when it comes to color photography, there is a great deal to consider to get the perfect shot. If you have ever watched a professional photographer edit their work after taking the shot, you will understand how complicated it can be.
So now that Google claims the Pixel 6 has the world’s “most inclusive” smartphone camera, it is important to understand why the company has made this a focus. For decades, color photography has been skewed to flatter lighter skin, a bias rooted in the chemistry of developing film. The bias persisted even as film cameras gave way to digital ones; the color science remained largely the same, so people with darker complexions could rarely get an accurate photo.
The frustrating thing for people of color is that the process introduced with color photography in 1861 remained the accepted standard for generations, a standard skewed against them. The introduction of AI models into photography has finally allowed that standard to change in mobile photography. Both Apple and Google have made this a focus of their flagship devices this year. Google’s solution is called Real Tone, a set of algorithms focused on recognizing different skin tones to produce the best possible shot. These algorithms deliver more nuanced white balance, improved auto exposure, and reduced stray light, among other things.
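To make the auto-exposure idea concrete, here is a minimal, hypothetical sketch of face-aware exposure metering. This is not Google’s or Apple’s actual pipeline; the function names, target values, and gain limits are all illustrative assumptions. The point it demonstrates is the general principle: a conventional meter exposes the whole frame toward mid-gray, which can underexpose a darker-skinned subject, while a face-aware meter exposes for the detected face region instead.

```python
def mean_luma(pixels):
    """Average luminance of a list of (r, g, b) pixels, using Rec. 709
    luma weights. In a real pipeline this would run on a detected face
    region, not the whole frame."""
    total = sum(0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in pixels)
    return total / len(pixels)

def face_aware_gain(face_pixels, target_luma=110.0, max_gain=4.0):
    """Exposure gain that brings the *face* toward a target luminance.
    target_luma and max_gain are made-up illustrative values, not
    anything taken from Real Tone or Smart HDR."""
    current = mean_luma(face_pixels)
    if current <= 0:
        return max_gain
    # Clamp the gain so we neither crush nor blow out the rest of the scene.
    return min(max_gain, max(1.0 / max_gain, target_luma / current))

# A toy face region that meters well below the target gets a gain above 1,
# brightening the subject instead of exposing for a bright background.
dark_face = [(40, 28, 22)] * 16
gain = face_aware_gain(dark_face)
```

The design choice being illustrated is simply *where* the camera meters: by anchoring exposure on the face rather than the frame average, the subject’s skin tone drives the result instead of being an afterthought.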
Apple has implemented a similar machine-learning model with Smart HDR 4 on the iPhone 13. The algorithm is trained to identify faces and differentiate between skin tones, producing the most accurate photos of people of color ever seen on an iPhone. As evidenced by this tweet, iPhone 13 users are starting to notice the improvement and are very appreciative of it.
Why This Is Important
Many people think that smartphone innovation has stagnated. When considering hardware only, this is probably true. But software is an area where innovation can still happen. Google and Apple have shown this with an emphasis on having cameras that bring out the best results for everyone. This is an innovation that brings positive advancement that everyone can notice as opposed to niche features that only tech enthusiasts will appreciate.
But beyond the feature itself, there is a deeper message here: an awakening by two of the world’s major tech giants. The core issue is that tech as a whole has had a diversity problem for a long time. Despite grandiose claims of expanding the world through innovative technology, and advertising that projects an image of inclusion, tech remains largely male and white.
Currently, only 20% of tech positions are held by women, and less than half of tech jobs are held by minorities. What’s more, the share of women in computer science positions has declined since the 1990s (data for these statistics can be found here). Tech is missing prominent voices from a variety of underrepresented groups, and in recent years that absence has become more evident.
In addition, there have been reports that these same beloved tech giants are out of touch with their employees. Two years ago, Google dealt with workers walking out over the company’s stance on building a search engine for the Chinese government and its handling of sexual misconduct allegations against prominent executives. This year, Apple has faced similar backlash from its employees over controversial hiring decisions and pay disparities. To put it plainly, tech companies paint an image of themselves that is often far from reality. What does this have to do with Apple and Google branding their new smartphone cameras as inclusive? It is a matter of reputation reconstruction.
Reputation is Everything
The way we view major tech companies has shifted. Facebook and Amazon were once great success stories of the internet age; these days they are labeled evil, greedy corporations. Apple and Google, once the hip tech companies that understood the world, are becoming more and more out of touch as they grow more prominent. There is a certain tone-deafness to these companies that grows louder and louder, making it nearly impossible to ignore.
The way a company changes this sort of narrative is by making a change that is public-facing and easy to understand. In this case, the two tech giants realized that the camera is the most important part of the modern smartphone experience for most people. They also recognized that the camera had historically been inadequate for people of color. A solution had presented itself: a way to begin rehabilitating their image with those who had started to lose faith in them.
Making a camera that brings out the best photos regardless of skin tone and complexion is a way to regain that faith. Google and Apple position themselves as tech companies with a soul. Where others build high-tech machines meant to be awe-inspiring feats of machinery and engineering, these two build products that speak to people’s emotional side. Their advertising often presents the Pixel and iPhone as lifestyle products, companions rather than gadgets.
This companion nature and messaging fit perfectly with emphasizing the inclusivity of a smartphone camera. Apple and Google are saying here that their camera understands you regardless of what you look like or where you are from. And in particular, it is a direct contrast to the manufacturers from Asia (like Samsung and Xiaomi) that they compete against.
Photographic preferences in the East and the West are vastly different. In many Asian markets, consumers prefer the boosted saturation and enhanced face-smoothing features those OEMs provide. In recent years, beauty standards in parts of Asia have come to romanticize whitening and smoothing of the skin, so these features make sense coming from phone makers based in China and South Korea. In Western countries, however, the preference runs toward true-to-life results: seeing a photo as your eye saw and envisioned it before you took the shot.
This is why it is Apple and Google, and not Samsung or OnePlus, that are advocating for features that promote inclusion. They are in line with the culture of their primary demographic and with the ideals both companies strive to live up to. The United States is a major market for both, and they understand the diversity that comes with it. That both companies recognized the need to change their computational photography is long overdue, but appreciated nonetheless. Even if their true intentions are a little less than genuine.