TrueSolvers is an independent technology publisher with a professional editorial team. Every article is independently researched, sourced from primary documentation, and cross-checked before publication.
Apple's iPhone 18 Pro is expected to introduce variable aperture on the main camera: a mechanical iris that physically adjusts the lens opening to control light intake and depth of field. The feature matches a capability professional photographers have used for decades, bringing optical control beyond what computational photography can replicate and giving iPhone users genuine creative authority over how their photos look.

Every iPhone from the 14 Pro through the current iPhone 17 Pro carries the same fundamental camera constraint: a fixed main lens opening that never changes. The iPhone 17's main lens is locked at f/1.6 whether the scene is a noon street or a candlelit dinner. Variable aperture changes that. The iPhone 18 Pro is expected to introduce a lens that physically adjusts its opening size, giving the camera mechanical control over light and depth of field that no software update has ever been able to provide.
That single f-number defines how much light reaches the sensor, and it never adjusts. For still photography, Apple's computational engine compensates well enough: the Camera app adjusts shutter speed, raises ISO, and applies software processing to manage exposure, and in most lighting conditions the results hold up. The constraint becomes harder to hide in video.
Cinematic video follows the 180-degree shutter rule, a convention inherited from film that makes motion blur look natural to the human eye. The rule sets the shutter to half the frame interval, so the shutter-speed denominator is double the frame rate: 1/48 second at 24fps, 1/120 second at 60fps. In bright outdoor light with a fixed f/1.6 aperture, correct exposure at those shutter speeds is physically impossible: ISO cannot drop below its base value, so the camera's only real escape is to push the shutter speed far past the rule to avoid blown highlights. Faster shutter speeds create the hyper-sharp, jittery look that distinguishes amateur phone video from professional footage. No option is neutral.
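The arithmetic behind that impossibility can be sketched with the standard exposure-value relation EV = log2(N²/t). A minimal Python check, assuming the common benchmark of roughly EV 15 for bright daylight at ISO 100 (the exact figure varies by scene):

```python
import math

def required_shutter(f_number: float, ev100: float, iso: float = 100.0) -> float:
    """Shutter time in seconds for correct exposure at a given aperture
    and scene brightness, from the exposure relation EV = log2(N^2 / t)."""
    ev = ev100 + math.log2(iso / 100.0)  # higher ISO raises the effective EV
    return f_number ** 2 / 2 ** ev

# Bright daylight is roughly EV 15 at ISO 100.
t_wide = required_shutter(1.6, ev100=15)
t_target = 1 / 48  # 180-degree rule at 24 fps

print(f"required shutter at f/1.6: 1/{round(1 / t_wide)} s")   # far faster than 1/48
print(f"180-degree target at 24 fps: 1/{round(1 / t_target)} s")
```

At f/1.6 the required shutter lands around 1/12800 second, more than eight stops past the 1/48 the rule calls for, which is why bright-light video footage from a fixed wide aperture looks jittery.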
From what we've gathered during our research, this is the part of the iPhone camera story that never fully surfaced in mainstream coverage. The fixed aperture forces a shutter-speed compromise that no software update, no computational photography feature, and no post-production tool can retroactively fix. Once footage is captured with the wrong motion blur, that blur is baked in.
Ming-Chi Kuo first reported variable aperture for both iPhone 18 Pro models in December 2024, noting that an iPhone 17 debut had been explored and set aside. The September 2026 target gives Apple's supply chain partners the runway needed to reach production scale. The feature is finally arriving because the engineering required to build it reliably in a smartphone has caught up with the need.
Variable aperture lets the camera physically expand or restrict the lens opening, moving between a wider setting that admits more light with shallower depth of field, and a narrower setting that restricts light while keeping more of the scene in sharp focus. This is the same mechanical system professional photographers have used on DSLR and mirrorless cameras for decades. On a dedicated camera body, aperture is often the first control a photographer learns to use deliberately.
On a smartphone, the dynamics work a little differently, and honest coverage requires acknowledging where the physics limit what variable aperture can deliver.
The lens opening on any camera creates depth of field based on two factors working together: the actual physical size of that opening and the physical size of the sensor behind it. A full-frame camera at f/1.8 has a large physical aperture opening because the lens is large. An iPhone at f/1.6 has a tiny physical opening because the lens is tiny. The result is that depth-of-field control on an iPhone is constrained by sensor size regardless of what the f-number says. Variable aperture cannot produce shallower background blur than what f/1.6 already generates on a small sensor. It can only stop down, increasing depth of field and reducing incoming light.
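A rough way to quantify that constraint is crop-factor equivalence: multiplying the f-number by the sensor's crop factor gives the full-frame aperture that produces similar depth of field. A sketch, assuming an illustrative crop factor of about 3.5, since Apple does not publish exact sensor dimensions:

```python
def dof_equivalent_aperture(f_number: float, crop_factor: float) -> float:
    """Full-frame f-number with roughly the same depth of field:
    multiply the phone's f-number by the sensor's crop factor."""
    return f_number * crop_factor

CROP = 3.5  # illustrative assumption; not a published Apple specification
for n in (1.6, 2.8, 4.0):
    eq = dof_equivalent_aperture(n, CROP)
    print(f"f/{n} on the phone ~ f/{eq:.1f} full-frame for depth of field")
```

Even wide open, the small sensor behaves like roughly f/5.6 on full frame for background blur, which is why the gains from variable aperture live at the narrow end of the range, not the wide end.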
After examining the physics more carefully, the conclusion is that variable aperture's most visible gains on an iPhone will come from the narrow end of the range, not the wide end. Stopping down keeps a landscape sharp from foreground rocks to distant ridgelines. It keeps all faces in a group portrait within the plane of focus rather than softening the edges. It maintains clarity across a product shot where a wide aperture would blur the near edge of the subject. These are genuine improvements. Portrait bokeh that approaches the separation of a 50mm lens on a full-frame body is not what is being delivered here, and coverage that implies otherwise sets expectations the hardware cannot meet.
Where variable aperture does outperform computational photography is in accuracy. Software-based portrait mode uses depth estimation algorithms to decide what to blur. Those algorithms make mistakes: hair edges develop halos, objects at similar distances get treated inconsistently, and backgrounds behind glasses or complex foregrounds render with artifacts. Optical depth of field does not guess. The blur it produces reflects real physics rather than an algorithm's interpretation of a depth map, which means it renders correctly in ways that software cannot always replicate.
The macro use case deserves more attention than it typically receives. At close focus distances, even iPhone sensors produce very thin planes of focus at wide apertures. A product shot, a flower detail, or a piece of jewelry captured up close can lose its front or rear edge to blur at f/1.6. Stopping down even modestly recovers that sharpness without requiring recomposition or distance adjustment.
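The close-focus arithmetic illustrates why. Using the standard macro approximation for total depth of field, DoF ≈ 2·c·N·(m+1)/m², here is a sketch with assumed values; the ~0.002 mm circle of confusion and 0.5x magnification are illustrative, not iPhone specifications:

```python
def macro_dof_mm(f_number: float, magnification: float, coc_mm: float = 0.002) -> float:
    """Approximate total depth of field at close focus (front plus rear),
    from the macro approximation DoF ~= 2 * c * N * (m + 1) / m^2."""
    return 2 * coc_mm * f_number * (magnification + 1) / magnification ** 2

m = 0.5  # illustrative close-up magnification
print(f"f/1.6: {macro_dof_mm(1.6, m):.3f} mm in focus")
print(f"f/4.0: {macro_dof_mm(4.0, m):.3f} mm in focus")
```

The plane of focus scales linearly with the f-number, so stopping down from f/1.6 to f/4 gives 2.5 times the in-focus depth, enough to keep the near and far edges of a small subject sharp.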
The history of variable aperture in smartphones is shorter and more instructive than most people realize. Samsung introduced a dual-aperture system on the Galaxy S9 in 2018. The system switched between f/1.5 in low light and f/2.4 in brighter conditions before Samsung dropped it with the Galaxy S20 in 2020, citing increased device thickness and higher manufacturing costs. The S10 and Note 10 continued the feature in between, giving the technology roughly two product generations before Samsung moved on.
Xiaomi revived the concept with more sophistication in 2024. The Xiaomi 14 Ultra became the first phone with a one-inch sensor to offer stepless variable aperture, with a continuous range from f/1.63 to f/4.0. Rather than Samsung's two-position switch, Xiaomi built a system that adjusted continuously, much like a dedicated camera's aperture ring. Reviewers noted the stepless control proved particularly useful for video, exactly as the physics would predict. Then, in 2025, Xiaomi released the 15 Ultra with the feature removed entirely, returning to a fixed f/1.63 main camera in favor of telephoto zoom improvements.
In our analysis of the track record, the pattern is consistent enough to take seriously: no manufacturer that has introduced smartphone variable aperture has kept it for long. Samsung dropped it after two generations; Xiaomi dropped it after one. Our interpretation is not that the feature failed users; the reviews in both cases were positive on the use cases it addressed. The manufacturers dropped it because hardware trade-offs, manufacturing complexity, and product prioritization outweighed the benefit within their specific roadmaps and price constraints.
Apple enters this history with a different set of constraints. Its supply chain integration runs deeper, its development timeline is longer, and its software ecosystem is built to abstract hardware complexity in ways that neither Samsung nor Xiaomi fully leveraged for this particular feature. Whether Apple bucks the one-generation pattern depends less on the hardware than on what the software layer does with it.
Samsung's 2018 system relied on voice coil motor actuators to move the aperture blades. VCM systems require a continuous electrical current to hold any position other than their default. Maintaining an aperture setting draws power constantly, and the magnetic nature of voice coil motors creates electromagnetic interference with nearby components. The physical size required for these mechanisms contributed to the camera bump thickness that Samsung eventually cited as a reason for discontinuation.
The actuator technology available in 2026 is categorically different. Shape Memory Alloy (SMA) actuators, developed by Cambridge Mechatronics among others, use extremely fine metal filaments that contract into a target position when heated electrically, then return to their default shape as they cool. The key property is that once the filament reaches the target configuration, no ongoing power is required to hold it. In a typical three-minute camera session with five aperture adjustments, SMA actuators with Zero Hold Power technology cut average current draw from the 25mA-plus typical of VCM systems to under 0.1mA, a reduction of more than 99%.
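The power difference can be modeled with a back-of-envelope sketch. The hold and pulse currents below are illustrative assumptions chosen to be consistent with the figures cited above, not measured values:

```python
SESSION_S = 180   # three-minute camera session
ADJUSTMENTS = 5   # aperture changes during the session

def avg_current_vcm(hold_ma: float = 25.0) -> float:
    """A VCM must hold position continuously, so the session average
    is essentially the hold current."""
    return hold_ma

def avg_current_sma(pulse_ma: float = 30.0, pulse_s: float = 0.1) -> float:
    """An SMA actuator draws current only during brief heating pulses,
    then holds position with zero power; average over the session."""
    return pulse_ma * pulse_s * ADJUSTMENTS / SESSION_S

print(f"VCM average: {avg_current_vcm():.1f} mA")
print(f"SMA average: {avg_current_sma():.3f} mA")
```

Under these assumptions the SMA average lands below 0.1 mA, because the actuator is idle for all but a fraction of a second per adjustment, while the VCM pays its hold current for the entire session.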
Beyond power consumption, SMA actuators are electromagnetically neutral, eliminating the interference concerns, and their slimmer physical profile fits more naturally within the constraints of a modern smartphone camera module. They are also engineered for millions of adjustment cycles, addressing the durability concern that follows any mechanical component in daily-use consumer hardware.
The way we read this development, Apple's patience on variable aperture is not creative reluctance. The VCM actuators of 2018 made the implementation expensive, thick, power-hungry, and electrically complicated. The SMA systems available now solve each of those problems specifically. Apple characteristically waits for the engineering to reach the point where it can deliver a feature reliably at scale, and the actuator technology crossing that threshold is what makes the iPhone 18 Pro launch the right moment.
The 180-degree shutter rule is not a stylistic preference. It is the reason film and television footage looks the way it does to human vision, encoding motion blur that matches the natural persistence of what eyes perceive when tracking movement. Shooting at the correct shutter speed produces footage that moves the way a scene should move. In bright daylight, a fixed f/1.6 aperture forces exposure compensation that violates this shutter relationship.
Professional cinema cameras handle this with physical neutral density filters, panels of optically neutral glass that reduce incoming light without affecting color or shutter speed. Variable aperture serves the same function built directly into the lens. Closing the aperture in bright conditions reduces incoming light, allowing the correct shutter speed to maintain natural motion blur without any external accessory.
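The light reduction from stopping down can be expressed in the same stops an ND filter is rated in: each stop halves the light, and stops = 2·log2(N2/N1). A quick sketch, assuming an illustrative f/1.6 to f/4.0 range:

```python
import math

def stops_of_light_cut(wide_f: float, narrow_f: float) -> float:
    """Exposure reduction in stops when stopping down from wide_f to
    narrow_f: light scales with N^2, so stops = 2 * log2(N2 / N1)."""
    return 2 * math.log2(narrow_f / wide_f)

stops = stops_of_light_cut(1.6, 4.0)
print(f"f/1.6 -> f/4.0 cuts {stops:.1f} stops of light")
print(f"about {2 ** stops:.2f}x less light reaching the sensor")
```

A swing from f/1.6 to f/4.0 cuts roughly 2.6 stops, a 6.25x light reduction, in the territory of an ND4-to-ND8 filter, which is exactly the range that rescues the 180-degree shutter outdoors.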
From what we've gathered during our research, this use case consistently received the strongest validation across every variable aperture smartphone implementation to date. Xiaomi 14 Ultra reviewers repeatedly identified the video exposure benefit as the feature's clearest practical win. The depth-of-field arguments require understanding and intention to use well. The video shutter argument requires nothing from the user: the camera produces better-looking footage outdoors because the mechanics make the correct shutter speed achievable.
The video benefit also intersects with the way Apple Intelligence is likely to be deployed. A well-designed automatic mode recognizes when shutter speed would need to violate the 180-degree rule and adjusts the aperture to compensate, invisibly. The photographer does nothing. The footage improves. This is the kind of hardware-software integration that computational photography cannot provide, because the problem exists before the image data exists, at the moment of capture.
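Apple has not described any such implementation, but the decision logic is simple enough to sketch. A purely hypothetical policy that picks the f-number keeping the shutter at its 180-degree value, with an assumed f/1.6 to f/4.0 range:

```python
import math

F_MIN, F_MAX = 1.6, 4.0  # assumed aperture range, for illustration only

def pick_aperture(ev100: float, fps: int) -> float:
    """Hypothetical auto-aperture policy: hold the 180-degree shutter time
    and solve EV = log2(N^2 / t) for the f-number, clamped to the lens's
    physical range. Beyond F_MAX a real system would still have to raise
    the shutter speed or fall back to computational exposure."""
    t = 1 / (2 * fps)                    # 180-degree shutter time
    ideal_n = math.sqrt(t * 2 ** ev100)  # aperture giving correct exposure at t
    return min(max(ideal_n, F_MIN), F_MAX)

print(f"bright daylight (EV 15), 24 fps: f/{pick_aperture(15, 24):.1f}")
print(f"indoor light (EV 7), 24 fps: f/{pick_aperture(7, 24):.1f}")
```

The point of the sketch is the shape of the integration: the user never sees an f-number, the footage simply keeps natural motion blur in more lighting conditions.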
Mark Gurman of Bloomberg confirmed variable aperture as part of the iPhone 18 Pro camera system in his Power On newsletter in early March 2026, joining Ming-Chi Kuo's December 2024 prediction as the two most reliable validators of the feature. The iPhone 18 Pro is expected to launch in September 2026 alongside the iPhone 18, iPhone 18 Plus, and iPhone 18 Pro Max. Gurman characterized the iPhone 18 Pro as an "S"-tier update overall, with variable aperture as the camera headline and Apple's first foldable iPhone taking the broader spotlight at the fall event.
Several uncertainties remain worth acknowledging. Apple is still in late-stage engineering sample testing, and the feature could be dropped if optical performance or manufacturability standards are not met. 9to5Mac's analysis of the supply chain data found the feature may be limited to the Pro Max in the first generation, though Ming-Chi Kuo's December 2024 prediction referenced both Pro models. Neither scenario is confirmed.
What is clear is that Samsung is already responding, actively asking multiple camera module partners to develop variable aperture samples specifically because of Apple's plans, with the Galaxy S27 Ultra as the likely target. The competitive dynamic means variable aperture is about to become a category specification for flagship smartphones, in the same way that multiple telephoto lenses and large sensors became table stakes over the last several years.
Variable aperture is one part of a larger hardware update that also addresses the iPhone's front panel design. The iPhone 18 Pro's under-display camera is expected to eliminate the Dynamic Island, pairing the main camera upgrade with the cleanest front glass an iPhone has shipped. For buyers evaluating the full upgrade picture, both changes address areas where the current lineup shows its age.
Our interpretation is that Apple's challenge here is less about the hardware, which the actuator technology appears to have resolved, and more about the software layer. Samsung and Xiaomi both introduced variable aperture without making it automatic or intuitive. The majority of iPhone users do not adjust exposure manually and would not recognize an f-number if the Camera app presented one. Apple's history with hardware-software integration suggests the smarter path is an intelligent automatic mode that adjusts aperture in the background, surfacing manual control only for users who seek it. That approach also limits actuation frequency, reducing mechanical wear on the iris mechanism over the device's lifetime.
For serious videographers, variable aperture is a meaningful capability that solves a real, fundamental limitation. For portrait photographers, the benefit is more modest than headlines suggest: the optical depth of field improvements on a small sensor run toward landscape and macro sharpness rather than full-frame-style subject separation. For the majority of iPhone photographers who shoot in automatic mode, the benefit may arrive invisibly as improved outdoor video and sharper landscape captures, delivered without any settings adjustment required.
The technology represents Apple's acknowledgment that certain optical problems cannot be solved after the fact. Variable aperture addresses exposure and depth of field at the moment of capture, where the physics actually live.
Will variable aperture be on the iPhone 18 Pro or only the Pro Max? Current reporting is ambiguous. Ming-Chi Kuo's December 2024 prediction referenced both Pro models, but 9to5Mac's supply chain analysis suggests the Pro Max may receive it first. No source has confirmed the final distribution.
Does variable aperture make Portrait Mode better? Partly. The optical depth of field on a small smartphone sensor has physical limits, so the background-blur improvement is modest compared to what Portrait Mode already achieves computationally. The more meaningful upgrade is that optical depth of field renders edges and hair accurately where Portrait Mode algorithms sometimes produce artifacts.
Can variable aperture replace an external ND filter for video? For standard daylight shooting with natural motion blur, yes. Professional cinematographers using very long exposures or specific creative effects will still use dedicated ND filters. For general iPhone videography, variable aperture performs the same function built into the hardware.
Will the aperture adjust automatically or do I need to set it manually? Apple has not confirmed the control interface. Based on how Apple Intelligence handles similar camera decisions, an automatic mode managing aperture in the background is the most likely implementation, alongside manual control for users who want it.
Is variable aperture available on the standard iPhone 18? No current reporting indicates the standard iPhone 18 will receive variable aperture. The feature is associated exclusively with the Pro and Pro Max models in all supply chain reporting to date.