From "Flick!"
At first glance, the iPhone 13 looks the same as the iPhone 12, but it has evolved significantly: the internal structure is a completely new design.

Was it only the camera that evolved? Or will the camera's evolution change our lives in a significant way? I don't know what Apple is thinking, but in any case, the defining feature of the iPhone 13 series is its camera. Posting to social media, shooting video, sharing photos and videos with family... How will such a high-performance, multifunctional camera change our lives? The iPhone 13 series, which will stay close at hand, will prove that over the coming year. Some people will surely say, "My current iPhone already shoots 12 megapixels," or "Android phones already have this feature," but I'd like you to read this article to the end first, and then think about it.
Almost every lens and sensor has been greatly improved in the iPhone 13, but the biggest contributor to its image quality is the A15 Bionic chip. It carries a 6-core CPU and a 4-core GPU (5 cores on the Pro), both faster than before, along with a Neural Engine capable of 15.8 trillion operations per second and a new image signal processor. In recent iPhones, what the user experiences as a single shot is actually several: the phone captures multiple frames, chooses the one with the least blur, combines frames exposed for the highlights with frames exposed for the shadows, draws on frames captured for specific colors, and merges them all into one photograph.
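One step of the multi-frame pipeline described above, picking the least blurred frame out of a burst, can be sketched in a few lines of pure Python. This is an illustrative approximation, not Apple's actual algorithm: the sharpness metric (variance of a Laplacian filter response) is a common generic blur measure, and the toy frames are invented for the example.

```python
# Sketch: score each frame of a burst for sharpness and keep the
# sharpest one. Frames are grayscale images as 2D lists of pixel values.

def laplacian_variance(frame):
    """Sharpness score: variance of a 4-neighbour Laplacian response.
    Blurred frames have weak edges, so their response varies little."""
    h, w = len(frame), len(frame[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (frame[y - 1][x] + frame[y + 1][x] +
                   frame[y][x - 1] + frame[y][x + 1] - 4 * frame[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def pick_sharpest(burst):
    """Return the frame with the highest sharpness score."""
    return max(burst, key=laplacian_variance)

# Toy burst: one frame with a hard edge, one where the edge is smeared.
sharp   = [[0, 0, 255, 255]] * 4
blurred = [[0, 85, 170, 255]] * 4
best = pick_sharpest([blurred, sharp])
```

A real pipeline would run a far more sophisticated version of this selection, and then align and merge the remaining frames rather than discard them.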
No wonder SLRs can no longer keep up... The iPhone "recognizes what it is shooting." If the sky is in the frame, it removes noise so the blue of the sky comes out beautifully, and renders that blue vividly. If food is in the frame, it brightens the shot so the food looks delicious. If a person's face is in the frame, it smooths the skin. The A15 Bionic chip processes what is in the frame to bring it closer to the way people actually perceive it. That does not make the result any less "real." Even when you shoot on film, the light reflected from the subject passes through the lens and exposes the emulsion, but the reaction differs depending on the emulsion's characteristics, and with color negative film the finished photo also changes depending on how it is printed to paper and processed. The brightness, darkness, color, and surface texture of the actual object can never be reproduced directly. Ultimately, the image is formed in our brains, and I think what Apple is doing with its photos and videos is bringing them as close as possible to that mental image of seeing the real thing.
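The scene-aware rendering described above amounts to applying a different adjustment to each semantically labelled region of the image. The sketch below mirrors the article's examples (sky, food, skin) in pure Python; the labels, gains, and pixel data are all invented for illustration and have nothing to do with Apple's actual pipeline, where a segmentation model would supply the per-pixel labels.

```python
# Sketch of semantic, per-region rendering: each pixel carries a label,
# and each label gets its own tone adjustment, echoing the article's
# examples. Gains here are arbitrary illustrative values.

ADJUSTMENTS = {
    # Make the sky's blue more vivid.
    "sky":  lambda r, g, b: (r * 0.9, g * 0.95, min(255, b * 1.15)),
    # Brighten food so it looks appetizing.
    "food": lambda r, g, b: tuple(min(255, c * 1.1) for c in (r, g, b)),
    # Placeholder: a real pipeline would smooth skin texture, which
    # needs neighbourhood filtering, not a per-pixel gain.
    "skin": lambda r, g, b: (r, g, b),
}

def render(pixels, labels):
    """Apply each pixel's label-specific adjustment; unknown labels
    pass through unchanged."""
    identity = lambda r, g, b: (r, g, b)
    return [ADJUSTMENTS.get(label, identity)(*px)
            for px, label in zip(pixels, labels)]

pixels = [(120, 160, 200), (180, 120, 80)]   # a sky pixel, a food pixel
labels = ["sky", "food"]
result = render(pixels, labels)
```

The point of the sketch is the structure, not the numbers: recognition decides *what* each region is, and rendering then applies region-appropriate processing instead of one global adjustment.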