Why Smartphone Night Photos Are So Good Now
Released on 03/25/2022
Taking photos at night on your phone
used to look terrible.
But if you purchased a new smartphone recently,
you may have noticed that your night photos have improved.
Ah, much better.
You can even take photos of stars.
I'm Julian Chokkattu, reviews editor at Wired,
and I've been reviewing smartphones for over five years.
How has smartphone photography gone from this,
to this beautiful photo?
Before we get into the technology
behind the new night modes,
let's first have a little chat about bad photos.
Take a look at this photo here,
taken on an iPhone 5 around 2014.
A couple elements stand out to me,
like that classic lens flare, or the blur.
No matter how nice or advanced the camera is,
it's always going to need a good source of light.
That's exposure, the amount of light
that reaches your camera sensor.
Right now, this lovely crew has lit me really well.
Let me show you.
[soft music]
If they cut the lights, now I'm back lit and underexposed.
This is the iPhone 3G in low light,
and this is the iPhone 13 Pro in low light.
Let's get the lights back on.
Part of the reason the iPhone 3G looks so underexposed
is because it didn't spend a lot of time taking the photo.
That's shutter speed.
That's the length of time the camera's little door is open,
exposing light onto the camera sensor.
One of the main reasons night mode on your phone
asks you to stay still is because
the longer you have the shutter open,
the more light you can let in,
which will produce a brighter photo.
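That shutter-speed relationship is simple enough to sketch in code. Here's a toy Python model (my own illustration, not any phone's actual metering code): light collected scales linearly with how long the shutter stays open, and photographers count each doubling as one "stop" of brightness.

```python
import math

# Toy model: light collected scales linearly with shutter time,
# so doubling the exposure time brightens the image by one "stop".
def exposure_stops(shutter_s: float, baseline_s: float = 1 / 60) -> float:
    """Relative exposure, in photographic stops, versus a baseline shutter speed."""
    return math.log2(shutter_s / baseline_s)

print(exposure_stops(1 / 60))  # 0.0 stops: same as the baseline
print(exposure_stops(1 / 15))  # ~2.0 stops: four times the light
```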
But here's the thing.
In night mode, during the seconds it asks you to wait,
the phone is actually taking more and more photos
to make a composite with machine learning algorithms.
So night mode is a part of the field
of computational photography.
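The core trick behind that composite can be sketched simply. This toy Python example (my own illustration, not any vendor's pipeline) averages many noisy readings of the same pixel; real night modes also align frames and use learned fusion weights, but the noise-averaging principle is the same.

```python
import random
import statistics

# Toy burst "fusion": average N noisy readings of the same pixel.
# Real night modes also align frames and apply learned fusion weights;
# this sketch shows only the noise-averaging principle.

def capture(true_value: float, noise: float) -> float:
    """One noisy sensor reading of a pixel."""
    return true_value + random.gauss(0, noise)

def fuse(true_value: float, noise: float, n_frames: int) -> float:
    """Average a burst; random noise shrinks roughly as 1/sqrt(n_frames)."""
    return statistics.mean(capture(true_value, noise) for _ in range(n_frames))

random.seed(0)
print(abs(capture(100.0, 10.0) - 100.0))   # error of a single shot
print(abs(fuse(100.0, 10.0, 30) - 100.0))  # error after fusing 30 frames
```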
I'm going to call up Ramesh Raskar at the MIT Media Lab
to get into the technical element of how it works.
[Ramesh] Hi Julian.
Would you be able to tell me
what exactly is happening when you take a night photo
on a modern-day smartphone?
There are three elements in any photography.
There is capture, there is process,
and then there's display.
And what we have seen over the last 10 years
is amazing improvement in all three areas.
So how is the software actually changing
what the photo will look like?
You will hear all these terms, HDR, HDR plus, night mode,
smart HDR, but all of them are roughly doing the same thing.
This key idea of so-called deep fusion,
where you're fusing the photos by using machine learning
and computer vision, is really the breakthrough
into today's low light photography.
Could you explain HDR?
So HDR, traditionally high dynamic range, simply means
whether it's a bright scene or a dark scene,
you can capture that in a single photo.
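One simple way to get that range, sketched here as a hypothetical illustration rather than any phone's actual algorithm, is exposure fusion: take a bracketed set of exposures and, at each pixel, weight the frames by how well-exposed they are, so shadows come from the bright frame and highlights from the dark one.

```python
# Minimal exposure-fusion sketch (an assumed textbook approach, not any
# vendor's actual HDR pipeline): blend bracketed exposures per pixel,
# weighting each value by how close it sits to mid-gray.

def well_exposedness(v: float) -> float:
    """Weight in [0, 1]; highest near mid-gray (0.5), lowest at clip points."""
    return max(0.0, 1.0 - abs(v - 0.5) * 2.0)

def fuse_pixel(exposures: list[float]) -> float:
    """Weighted average of the same pixel across bracketed frames."""
    weights = [well_exposedness(v) for v in exposures]
    total = sum(weights) or 1.0  # avoid divide-by-zero if every frame clipped
    return sum(v * w for v, w in zip(exposures, weights)) / total

# Dark frame keeps the highlight, bright frame keeps the shadow;
# the well-exposed middle frame dominates the result.
print(fuse_pixel([0.05, 0.5, 0.98]))
```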
A smartphone has seen millions of photos of a sunset,
or food, or a human face.
It has learned over time what are the best ways
to enhance such a photo, and how to either
reduce the graininess, or make it look more vibrant
and choose the right saturation.
Choosing those parameters is basically machine learning
when it comes to photography.
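For a concrete sense of one such parameter, here is the classic gray-world white-balance rule in Python. This is my own illustration of the kind of hand-tuned heuristic that machine learning now replaces with learned, scene-aware choices; it is not any phone's actual code.

```python
# Gray-world auto white balance: a classic hand-tuned rule, shown as an
# example of the kind of parameter a phone now picks with machine
# learning instead. Illustrative only, not any phone's implementation.

def gray_world(rgb_pixels):
    """Scale each channel so the scene's average color becomes neutral gray."""
    n = len(rgb_pixels)
    means = [sum(p[c] for p in rgb_pixels) / n for c in range(3)]
    gray = sum(means) / 3.0
    gains = [gray / m for m in means]
    return [tuple(p[c] * gains[c] for c in range(3)) for p in rgb_pixels]

# A warm (reddish) indoor scene gets pulled back toward neutral.
balanced = gray_world([(0.8, 0.5, 0.3), (0.6, 0.4, 0.2)])
print(balanced)
```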
Now let's take a look at this machine learning in action
by comparing some photos.
The one on the left is the iPhone 3G,
so quite a long time ago.
And the one on the right is the iPhone 12.
What are your first thoughts
in what they're doing differently?
So you can see that the previous phones
just gave you a photo from a single instant.
The photo on the right is actually not physically real,
in the sense that different things were happening.
People were bobbing their heads,
and the lights were flashing.
And so the photo's actually composed from multiple instants.
So when you try to fuse these multiple photos,
the light in one photo could be one direction,
light in the later photo could be in a different direction.
And it's taking some clever decisions
to create an illusion, as if this photo was taken
at that single instant.
Here you can also see the HDR in effect,
where the audience is completely dark
in the iPhone 3G photo, whereas you can actually see
everyone's heads in the other one.
If an AI is learning how to color correct a night scene
based on what it thinks it should be,
are we moving away from photo realism?
Julian, I think photo realism is dead.
We should just bury it, and it's all about hallucination.
The photo you get today has almost nothing to do
with what the physics of the photo says.
It's all based on what these companies are saying
the photo should look like.
So yeah, I took one of these with the Pixel 6
and one of these with the iPhone 13 Pro Max.
What happened there that would've caused those colors
to be very different between the two photos?
These two companies have decided to give you
a very different photo experience.
The Pixel might have taken 20 photos.
It's also recognizing certain features:
whether there's a sky, is it outdoors,
what kind of white balance it should have.
There's some automatic beautification also being applied.
So most of the photos we see are hallucinations,
but not the physical representation of the world out there.
These companies are providing us with ways to
control some of that, like turn off
that beautification feature or maybe make it even stronger.
Do you think that's where the compromise will lie,
with the people who do want to
tailor some of their own shots, to give them that control
and those options to tweak their settings?
The innovations in all these three areas
have actually taken the control away from us.
But in reality, it's not that difficult
for these companies to provide those controls back to us.
They're just making an assumption
that most consumers would like to just take a photo,
click a button, and get something they really
would like to see, whether it matches the reality or not.
I think the thing that we really care about is,
you go on a trip, and you reach Paris,
and the Eiffel Tower is in a haze.
And what you would like to see is take a photo
with your family with Eiffel Tower in the back
as if it's a bright sunny day, right?
And that's where as a consumer,
you yourself are willing to separate the physics,
the reality from hallucination,
because if somebody can paste just a bright, sunny photo
of Eiffel Tower behind your family,
you'll be pretty happy about it.
So we focused on night photography.
Every time we look at the nighttime photos,
those actually do seem to be improving year over year.
But broadly, what would you say are some of those challenges
that are left for photography in general
when it comes to smartphones?
In terms of night mode,
there are lots of challenges right now.
If you want to do something that's high speed,
it's very difficult to capture that at nighttime.
It's also difficult to capture very good color
at nighttime, because nighttime photos use burst mode,
and the challenge with burst mode
is that every frame has so-called read noise.
So there's a cost the camera pays
every time it reads out a photo.
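That read-noise cost can be put into rough numbers. In this toy Python budget (illustrative values, not from any real sensor), independent noise sources add in quadrature, so a burst of N short frames pays the read-noise penalty N times, while one long exposure pays it once.

```python
import math

# Toy noise budget with illustrative numbers (not from any real sensor):
# a burst of N short frames pays read noise N times, while one long
# exposure pays it only once. Independent noise adds in quadrature.

def total_noise(n_frames: int, shot_noise_per_frame: float, read_noise: float) -> float:
    """Noise of the summed burst: each frame contributes shot + read noise."""
    return math.sqrt(n_frames * (shot_noise_per_frame**2 + read_noise**2))

# Same total light either way: shot noise for the whole capture is 10.0.
single = total_noise(1, shot_noise_per_frame=10.0, read_noise=3.0)
burst = total_noise(30, shot_noise_per_frame=10.0 / math.sqrt(30), read_noise=3.0)
print(single, burst)  # the burst ends up noisier, purely from extra reads
```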
But the other technique many companies are using
is just using lots of tiny lenses.
Now some phone companies have five lenses,
and that's one trick to capture just five times more light.
How does that affect the rest of the phone's capabilities?
What can we expect in the future?
Photography or imaging should give us superhuman powers,
so we should be able to see through fog,
we should be able to see through rain.
we should be able to see a butterfly
and see all the spectrums, not just the three colors.
I think we shouldn't be limited
to just seeing what we are seemingly experiencing.
I would like to see a beautiful viewfinder.
If I'm in Paris and as I'm moving my viewfinder,
it should tell me, hey, if I take a picture
of the Eiffel Tower, it's very jaded.
A lot of people are out taking a photo.
But if you keep rolling and there is this tiny statue,
actually not enough people have taken the photo of this.
So I think we're going to see this very interesting progress
in capture, processing, and display.
And I'm very excited about
what photography of tomorrow will look like.
[soft music]
I'm going to show you some of my favorite features
with the iPhone 13 Pro and the Google Pixel 6.
We're doing low light photography, so let's cut the lights.
Let's open up the camera
and see what happens with night mode.
You can see that I'm already in a pretty dark area,
so night mode has been triggered here.
Once you tap it,
you can actually control the length of the exposure.
So if you think that you might need a longer shot,
sometimes that might produce a brighter image.
If I tap on the background, it'll expose for the background
and it will also change the focus there.
So you can actually slide it up and down
to change the brightness, or the shadows in the shot.
Those are just a couple of features
in the camera app itself.
All right, let's bring the lights back on.
So we have to talk about tripods.
Tripods are an easy way to up your photo game,
especially at night.
Of course, a big problem with taking photos at night
is hand shake when you're taking a photo.
Once more, can we cut the lights?
Can I get a volunteer?
So now I'm going to first take a photo without a tripod,
and see how it reacts then.
So you can just basically switch over to the Night Sight mode
and tap the photo.
But now if I switch over to a tripod,
it's going to be much more stable.
And if I tap the button, it knows that it's on a tripod,
and you can see it is taking a lot longer to take the photo.
It's taking multiple, multiple images
of different exposures.
Shooting handheld is a problem, because the shutter speed
is trying to take in as much light as possible.
And that means your hands are shaking,
and that's influencing the shot.
That's what makes it nearly impossible
to take photos of stars without a tripod.
Certain phones, like the Pixel 6,
let you take photos of the stars
with a dedicated astrophotography mode.
And essentially it's doing what night mode is doing,
but for a much longer period of time,
like two, three, sometimes even five minutes.
And what it really needs is the phone to be on a tripod.
If you're curious about what some of our favorite phones are
for taking photos, or maybe just looking at
other camera gear that might help you take
some of these better photos,
well, we have guides on wired.com.
And as Ramesh said, it's going to be really interesting
to see how our cameras improve in the future,
whether they'll completely decide on their own
exactly what photo you should take,
or if you'll have any control left.
Photo realism is dead.
No, that's dark.
Jesus.
I hope this video helped you understand a little bit more
about night photography, and I hope
you continue going out there taking lots and lots of photos.
[soft music]