Let me tell you something that's been bugging me for years. Every time I unlock my iPhone, I see that Dynamic Island sitting there at the top of the screen, and I think: "This is supposed to be cutting-edge technology?" Don't get me wrong: the Dynamic Island is clever software design. But physically? It's just Apple's way of making the best out of a technological limitation they haven't been able to solve. Until now, maybe.
The iPhone 18 Pro, coming in September 2026, is rumored to finally crack one of the most difficult engineering challenges in smartphone history: hiding Face ID's complex sensor array completely under the display. And after digging into what's actually involved in making this work, I've realized something important: this isn't just about making phones look prettier. This is about solving problems that have stumped engineers at Apple and across the entire industry for nearly a decade.
Why This Has Taken So Freaking Long
Here’s what most people don’t understand: Face ID isn’t just a camera that looks at your face. It’s an incredibly sophisticated 3D mapping system that projects 30,000 invisible infrared dots onto your face, captures how they reflect back, and builds a depth map accurate enough that the chances of someone else unlocking your phone are literally one in a million.
Think about what that means for a second. Your iPhone is doing real-time 3D scanning of your face every single time you glance at it, and it needs to work flawlessly whether you’re standing in bright sunlight, sitting in a dark room, wearing glasses, or sporting a new hairstyle. That’s insanely complex technology, and every single component needs an unobstructed path to do its job.
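If you want a concrete sense of what that real-time 3D scanning looks like from the developer side, Apple exposes a cousin of this pipeline through ARKit's public face-tracking API (Face ID itself is completely walled off from apps). Here's a minimal Swift sketch, assuming a TrueDepth-equipped device and the usual camera permission prompt; the class name and the print statements are just for illustration:

```swift
import ARKit

// Minimal sketch: tap the TrueDepth camera through ARKit's public
// face-tracking API. This is the closest developer-facing surface to the
// depth data described above; Face ID's own pipeline is not accessible.
final class FaceDepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return } // needs TrueDepth hardware
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Per-frame depth map derived from the projected IR pattern.
        if let depth = frame.capturedDepthData {
            let map = depth.depthDataMap
            print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
        }
        // The fitted 3D face mesh, updated every frame.
        if let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first {
            print("Face mesh vertices: \(face.geometry.vertices.count)")
        }
    }
}
```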
Now try putting all of that underneath a screen. An OLED screen. One made up of millions of individual pixels, each one capable of blocking or scattering the very infrared light that Face ID depends on.
See the problem?
This is why we’ve been stuck with notches since the iPhone X in 2017, and why the Dynamic Island has been Apple’s compromise solution since 2022. It’s not because Apple’s engineers are lazy or lack creativity. It’s because the physics involved are genuinely difficult to overcome.
The Technical Nightmare Apple Has Been Wrestling With
Let me break down the specific challenges that have delayed this technology for years, because understanding these helps explain why 2026 might finally be the year.
Challenge #1: Infrared Light Doesn’t Play Nice With OLED Displays
Face ID's TrueDepth system relies specifically on infrared light: a dot projector sprays thousands of IR dots onto your face, and an infrared camera reads them back. The problem? An OLED display stack is packed with layers that block, scatter, and absorb light trying to pass through it. Controlling light is literally the panel's job.
When you try to push infrared light through an OLED display, it’s like trying to shine a flashlight through a dense fog. The signal gets scattered, weakened, and distorted. The IR dots that need to hit your face with precision get diffused before they even leave the phone. And the reflected dots coming back? They have to pass through the display again, losing even more clarity.
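To put rough numbers on that round trip, here's a tiny Swift sketch. The 20% one-way transmission figure is a made-up number for illustration, not a measured spec; the point is that whatever the display's one-way loss is, the dot pattern pays it twice.

```swift
import Foundation

// Illustrative only: assume the OLED stack passes some fraction of IR light.
let oneWayTransmission = 0.20  // hypothetical 20% of the IR makes it through

// The projected dots pass through the display once on the way out,
// and the reflections pass through again on the way back in.
let roundTrip = oneWayTransmission * oneWayTransmission  // 0.04, i.e. 4% of the signal survives

// The same loss expressed in decibels, as an optics engineer might put it.
let lossInDB = -10 * log10(roundTrip)  // ≈ 14 dB of attenuation

print(String(format: "Round-trip signal: %.0f%%, loss ≈ %.1f dB", roundTrip * 100, lossInDB))
```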
This isn't a minor inconvenience; it fundamentally compromises the accuracy and security of the entire Face ID system. Apple can't just accept a "good enough" solution here. Face ID needs to maintain its one-in-a-million security standard, or they're putting millions of users at risk.
Challenge #2: You Can’t Just “Turn Off” Pixels
The obvious solution sounds simple: just turn off the pixels where the sensors need to see through, right? Well, early prototypes tried exactly that approach, and the results were disappointing.
When you selectively deactivate pixels, you create visible artifacts on the screen. Users would see a slight discoloration or transparency in that area, which defeats the entire purpose of hiding the sensors. Plus, deactivated pixels mean you're still effectively creating a cutout; it's just one made of dead pixels instead of a physical hole.
Apple’s solution, according to recent industry reports, involves something far more sophisticated: removing specific subpixels from the display in ways that won’t be noticeable to users. This maintains display quality while creating invisible windows for the infrared projection system.
Think about the precision required here. We're talking about selectively engineering certain portions of the display at the subpixel level, in areas measured in micrometers, to allow IR light through while maintaining the visual quality users expect from an iPhone display. That's not just difficult; it's borderline miraculous.
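For a sense of scale, here's the back-of-the-envelope math, assuming the roughly 460 pixels-per-inch density of recent Pro-model displays:

```swift
// Rough scale of the features being engineered, assuming ~460 ppi
// (the approximate density of recent iPhone Pro displays).
let pixelsPerInch = 460.0
let micronsPerInch = 25_400.0

let pixelPitchMicrons = micronsPerInch / pixelsPerInch  // ≈ 55 µm between pixel centers
let subpixelScaleMicrons = pixelPitchMicrons / 3        // each pixel splits into R/G/B subpixels, ≈ 18 µm

print("Pixel pitch ≈ \(Int(pixelPitchMicrons)) µm, subpixel features ≈ \(Int(subpixelScaleMicrons)) µm")
```

Those subpixel-scale features are a fraction of the width of a human hair, and they have to be altered in a way the eye never notices.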
Challenge #3: The Hardware Redesign Nightmare
Face ID isn’t a single sensor. It’s an entire system called TrueDepth that includes multiple components: a dot projector, an infrared camera, a flood illuminator, and the regular front-facing camera. Each of these components has specific optical requirements and needs to be positioned precisely relative to the others.
Moving these components under the display means redesigning literally everything. The dot projector needs to project through the screen. The IR camera needs to receive through the screen. The software algorithms that process the facial data need to compensate for the signal loss and optical interference introduced by having a display in the way.
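Nobody outside Apple knows what those compensation algorithms actually look like, but one standard imaging technique, flat-field correction, gives a feel for the kind of step involved: characterize how much each pixel's view is attenuated by the display during factory calibration, then divide that attenuation back out at runtime. The Swift sketch below is purely illustrative, with invented types and a toy calibration map; it is not Apple's method.

```swift
// Purely illustrative: a flat-field-style correction, the kind of step an
// under-display sensor pipeline might use to undo per-pixel attenuation.
// The transmission map is invented for this example; a real one would come
// from factory calibration of the specific panel.

typealias IRFrame = [[Double]]  // raw IR intensities, row-major

func compensate(raw: IRFrame, transmissionMap: IRFrame) -> IRFrame {
    // Divide out each pixel's known attenuation, clamping the divisor so
    // noise doesn't explode where almost no light gets through.
    raw.enumerated().map { y, row in
        row.enumerated().map { x, value in
            let t = max(transmissionMap[y][x], 0.05)
            return value / t
        }
    }
}

// Toy example: a 2x2 patch where the display blocks more light on the right.
let raw: IRFrame = [[0.20, 0.10],
                    [0.18, 0.09]]
let transmission: IRFrame = [[0.40, 0.20],
                             [0.40, 0.20]]
print(compensate(raw: raw, transmissionMap: transmission))  // roughly uniform: ~0.5 top row, ~0.45 bottom
```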
And here’s the kicker: Apple can’t compromise on speed or accuracy. Face ID currently unlocks your phone almost instantly. Users expect that. If under-display Face ID takes twice as long or fails more often, people will hate it, regardless of how good it looks.
The Breakthrough: Spliced Micro-Lens Glass
So how is Apple finally solving this? According to technical reports from their supply chain, Apple is testing a specific material solution known as "Spliced Micro-Lens Glass." This isn't just fancy marketing speak; it's a genuinely innovative approach to a difficult problem.
The basic concept involves modifying the glass above the sensor array with micro-lenses—tiny optical elements that guide light through the display pixels with minimal signal loss or distortion. Think of them as precision tunnels for infrared light, carefully engineered to maintain the integrity of the IR signals while keeping the display looking uniform to the human eye.
Here’s what makes this approach clever: instead of trying to make the display fully transparent to infrared light (which is basically impossible), or creating visible cutouts (which defeats the purpose), Apple is creating carefully engineered pathways that are transparent to IR but invisible to the human eye.
The engineering tolerances on this must be insane. We’re talking about manufacturing precision at the microscale, integrated into a component that also needs to be durable enough to survive years of daily use, temperature fluctuations, and the occasional drop.
What This Means for the iPhone 18 Pro Design
If Apple successfully pulls this off for the iPhone 18 Pro in September 2026, the front of your iPhone is going to look dramatically different. And I mean dramatically.
Based on the most credible reports, here's what we're looking at: instead of the current Dynamic Island, the pill-shaped cutout that's been standard since the iPhone 14 Pro, you'll have just a small hole-punch camera in the top-left corner. That's it. No visible Face ID sensors. No infrared components interrupting your screen. Just one tiny circular cutout for the selfie camera, smaller than what you see on most Android phones.
Apple’s new under-screen Face ID system will sit next to a top-left camera cutout, meaning the Dynamic Island software feature might relocate to that corner, or potentially disappear entirely depending on which rumors you believe.
But here’s where it gets confusing, and honestly, this is part of what makes following these rumors so frustrating: nobody seems to agree on what happens to the Dynamic Island itself. Some reports say it’s gone completely. Others say it’ll shrink but stay centered. Still others claim it’ll move to the top-left corner along with the camera.
My best guess? The Dynamic Island as a software feature probably sticks around in some form, because developers have spent two years optimizing apps for it. But its physical manifestation, that black pill shape, is likely going away on the Pro models.
The Timeline: Why Now and Not Earlier?
You might be wondering: if this technology is so groundbreaking, why is it only showing up in 2026? Apple’s been working on this for years, according to patent filings. What took so long?
The answer comes down to a combination of technical maturity, manufacturing capability, and Apple’s obsessive perfectionism.
First, the display technology needed to advance. Samsung Display, Apple’s supplier, recently developed under-screen infrared technology that would pave the way for under-screen Face ID. This wasn’t available two or three years ago in a form that met Apple’s standards.
Second, the manufacturing processes for creating micro-lens glass and precision subpixel engineering needed to mature to the point where Apple could produce millions of units reliably. Remember, Apple ships over 200 million iPhones per year. Any technology they use needs to scale to that volume without quality issues.
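To make that scale concrete, here's a quick back-of-the-envelope calculation; the defect rate is an invented number, purely for illustration:

```swift
// Why yield matters at iPhone volume. The 0.1% figure is made up for
// illustration; actual panel defect rates are closely guarded.
let annualUnits = 200_000_000.0
let hypotheticalDefectRate = 0.001

let defectivePanels = annualUnits * hypotheticalDefectRate
print("Even a 0.1% defect rate means \(Int(defectivePanels)) bad panels a year")  // 200,000
```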
And third, and I think this is crucial, Apple probably wasn't willing to compromise on Face ID's security and reliability. Display analyst Ross Young has been tracking this technology since 2023, and his timeline kept pushing back, with delays attributed to "sensor issues." Translation: Apple wasn't satisfied that the under-display version matched the performance of the current system.
This is actually one of the things I respect about Apple, even when it’s frustrating. They could have shipped under-display Face ID two years ago if they were willing to accept slightly worse performance or occasional failures. But they didn’t. They waited until they could do it right.
What It’ll Actually Feel Like to Use
Okay, so the engineering is fascinating, but let’s talk about what this actually means for you as a user. Because that’s what really matters, right?
In ideal conditions, meaning good lighting and you looking straight at your phone, under-display Face ID should work exactly like the current system. Same speed, same reliability, same security. You glance at your phone, it unlocks. No noticeable difference from what you're used to.
But here’s the reality that nobody wants to talk about: in challenging conditions, there might be subtle differences. And I say this as someone who’s used Face ID daily since the iPhone X launched.
Face ID currently struggles occasionally in very specific situations: extreme backlighting, certain angles, very bright sunlight. These aren’t common problems, but they happen. Under-display Face ID will likely have its own set of edge cases where performance isn’t quite perfect.
Why? Because you're adding an extra variable: the display itself is now part of the optical system. Even with micro-lens glass and carefully engineered IR pathways, that display introduces some level of interference that isn't present in the current system, where the sensors have a direct, unobstructed view.
Now, I’m not saying it’s going to fail all the time or be unreliable. Apple’s not stupid, and they’re not going to ship something that frustrates users. But expecting absolutely zero difference in every possible scenario is probably unrealistic.
The good news? For 95% of daily use (unlocking your phone while sitting at your desk, lying in bed, or walking around outdoors), you probably won't notice any difference at all. The edge cases where current Face ID works and under-display might struggle slightly are rare enough that most users will never encounter them.
The Dynamic Island Confusion: Three Competing Theories
I’ve been following Apple rumors for long enough to know when nobody really knows what’s happening, and the Dynamic Island situation for iPhone 18 Pro is definitely one of those cases. Let me lay out the three main theories and what evidence supports each:
Theory 1: Dynamic Island Disappears Completely
The Information, which is usually pretty reliable, reported that iPhone 18 Pro models will have no Dynamic Island at all, just a pinhole cutout located at the upper left of the display. Their reasoning makes sense: if Face ID is under the display and the camera is in a corner punch-hole, what physical reason is there for the Dynamic Island's shape?
The problem with this theory? Apple just spent two years convincing developers to optimize apps for the Dynamic Island. Throwing that away feels wasteful, even for Apple.
Theory 2: Dynamic Island Shrinks But Stays Centered
Bloomberg’s Mark Gurman, who has an excellent track record on Apple rumors, claims the iPhone 18 Pro will feature a slimmed-down Dynamic Island rather than removing it entirely. This would make sense if some components still need to be visible, even if most of Face ID moves under the display.
The challenge here is: if only the camera needs a cutout, why keep it in the center? That wastes screen real estate for no reason.
Theory 3: Dynamic Island Moves to Top-Left
This is the wild card theory, and honestly, it’s my personal favorite from a logic standpoint. If the camera is moving to the top-left corner anyway, why not have the Dynamic Island software feature expand from that corner when needed?
Technically, this makes sense. The Dynamic Island is largely software-defined anyway. It could just as easily expand from a corner as from the center. And it would preserve all the work developers have done supporting Dynamic Island functionality.
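That developer investment is concrete: apps hook into the Dynamic Island through ActivityKit and WidgetKit, and nothing in that API hard-codes where the island sits on screen. Here's a minimal Live Activity sketch in Swift (the delivery-tracking attributes are invented for the example); if Apple moved the anchor point to a corner, layouts like this wouldn't need to change:

```swift
import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes for a delivery-tracking Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesRemaining: Int
    }
    var orderNumber: String
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            // Lock Screen / banner presentation.
            Text("Order \(context.attributes.orderNumber): \(context.state.minutesRemaining) min")
        } dynamicIsland: { context in
            // The system, not the app, decides where these presentations appear.
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text("\(context.state.minutesRemaining) min remaining")
                }
            } compactLeading: {
                Image(systemName: "shippingbox")
            } compactTrailing: {
                Text("\(context.state.minutesRemaining)m")
            } minimal: {
                Image(systemName: "shippingbox")
            }
        }
    }
}
```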
My prediction? We’ll probably see some version of Theory 3, where the camera punch-hole serves as an anchor point for Dynamic Island animations when needed, but the display feels more full-screen most of the time.
What About the Regular iPhone 18?
Here’s something important to note: all these under-display Face ID rumors specifically reference the iPhone 18 Pro and iPhone 18 Pro Max. The standard iPhone 18 models? They’re keeping the Dynamic Island.
This makes sense from Apple’s perspective. Under-display Face ID is expensive and complex. Making it a Pro-exclusive feature justifies the price difference and gives people a reason to upgrade to the more expensive models.
It’s also a smart way to de-risk the technology. If there are any issues with the first generation of under-display Face ID, they’re contained to the Pro models which represent a smaller portion of total iPhone sales. Apple can iterate and improve the technology before rolling it out to the entire lineup in future years.
Plus, from a manufacturing standpoint, Apple probably can’t produce enough under-display Face ID components to meet demand for all iPhone models in year one. Limiting it to Pro models makes the volumes manageable.
The Broader Industry Impact
Here’s what’s interesting about Apple finally doing this: it’s going to force the entire smartphone industry to up its game.
Android manufacturers have had under-display fingerprint sensors for years, but most haven’t attempted under-display facial recognition at the complexity level of Face ID. Samsung, Xiaomi, and others have tried various under-display camera solutions, but the image quality compromises have been significant enough that adoption has been limited.
When Apple ships under-display Face ID that actually works well, it sets a new baseline for what premium smartphones should offer. Suddenly, having visible camera cutouts or notches is going to feel dated. Other manufacturers will need to match or exceed Apple’s implementation to remain competitive.
I expect we’ll see a wave of under-display facial recognition systems from Android manufacturers in 2027 and beyond, as the technology matures and becomes more accessible. Apple’s entry into this space legitimizes it in a way that helps the entire industry.
September 2026: What to Actually Expect
So let’s bring this all together. In September 2026, when Apple takes the stage to announce the iPhone 18 lineup, here’s what I think we’ll actually see:
The iPhone 18 Pro and Pro Max will feature under-display Face ID using micro-lens glass technology developed in partnership with Samsung Display. The front of the phones will have a single camera punch-hole in the top-left corner, dramatically cleaner than the current Dynamic Island design.
Face ID will work essentially the same as it does now for most users, with the same speed and security. There might be very minor differences in edge-case scenarios, but nothing most people will notice in daily use.
The Dynamic Island software feature will probably stick around in some modified form—either shrinking significantly or relocating to work with the new top-left camera position. Apple won’t completely abandon it because developers have invested time in supporting it.
The standard iPhone 18 models will keep the current Dynamic Island design, making under-display Face ID a Pro-exclusive feature, at least for the first year.
And honestly? I think Apple will nail the execution. They’ve had years to work on this, they’ve waited until the technology is mature enough to meet their standards, and they have more resources than anyone else in the industry to throw at solving these problems.
The Question Nobody’s Asking: Should You Care?
Here’s the thing that’s been nagging at me as I’ve researched this: does any of this actually matter?
Like, yes, under-display Face ID is an impressive technical achievement. Yes, it’ll make the iPhone 18 Pro look more futuristic. But does it meaningfully improve your experience using the phone?
The Dynamic Island works fine. Face ID already unlocks your phone quickly and securely. The current design isn’t really getting in the way of anything important. So are we just chasing aesthetic perfection for its own sake?
Maybe. But I also think there’s value in pushing technology forward, even when the current solution is “good enough.” Every major advancement in smartphones has seemed unnecessary to some people at the time. Did we really need to lose the home button? Did we really need Face ID when Touch ID worked fine? Did we really need larger screens when 3.5 inches was perfectly usable?
In hindsight, all of those changes improved the user experience in ways that weren’t immediately obvious. More screen real estate means more content visible at once. Face ID works with gloves and wet hands when Touch ID doesn’t. Losing physical buttons makes devices more water-resistant.
I suspect under-display Face ID will be similar. Yeah, the Dynamic Island is fine. But when you’ve used a phone with a completely uninterrupted display for six months, going back will feel like a step backward. Just like how using a phone with a notch feels dated now, even though it seemed fine in 2017.
The Real Story: It’s About the Journey, Not the Destination
You know what the real story behind under-display Face ID is? It’s not about the finished product we’ll see in September 2026. It’s about the years of work, the thousands of engineering hours, the countless prototype failures, and the persistence required to solve problems that seemed insurmountable.
It’s about display manufacturers figuring out how to manipulate subpixels at microscale precision. It’s about optical engineers designing micro-lens arrays that can guide infrared light without creating visible artifacts. It’s about software engineers rewriting facial recognition algorithms to compensate for signal degradation. It’s about manufacturing specialists developing processes that can produce millions of these displays reliably.
That's the real story. The finished iPhone 18 Pro will look simple: just a screen with a tiny camera hole. But underneath that simplicity is some of the most sophisticated engineering in consumer electronics.
And honestly? That’s what makes this stuff fascinating to follow, even when we’re obsessing over details that most users will never think about. The gap between “this should be possible” and “we’ve actually done it reliably at scale” is where innovation lives.
What Comes Next
Assuming Apple successfully ships under-display Face ID in the iPhone 18 Pro, what’s next? Where does this technology go from here?
The obvious next step is rolling it out to the entire iPhone lineup. By iPhone 19 or iPhone 20, I’d expect every model to feature under-display Face ID as standard. The technology will become cheaper and easier to manufacture, making it feasible for non-Pro models.
Beyond that? The holy grail is getting the front-facing camera under the display too. Several Android manufacturers have attempted this with limited success; the image quality suffers significantly when you're shooting through a display. But if Apple's engineers can solve the Face ID problem, the camera problem probably isn't far behind.
Imagine an iPhone with absolutely nothing visible on the front except screen. No punch-holes, no cutouts, no notches. Just a single, uninterrupted display from edge to edge. That’s the long-term vision, and under-display Face ID is a crucial step toward making it real.
We might also see this technology expand to other Apple products. The iPad Pro could benefit enormously from under-display Face ID, especially with Apple pushing it as a laptop replacement. MacBooks might eventually incorporate it too, though the use case there is less clear given that most people don’t use facial recognition on their laptops.
The Bottom Line: Should You Get Excited?
Look, I get it. We’re talking about a phone that won’t be announced for another eight months, based on rumors and supply chain leaks that might or might not be accurate. Getting too hyped about unreleased technology is a recipe for disappointment.
But here’s what I think: under-display Face ID represents a genuine technological breakthrough, not just an incremental improvement. If Apple pulls it off successfully, it’ll be one of the most significant iPhone upgrades in years.
Will it change your life? No, probably not. Your current iPhone unlocks with your face just fine. But it'll make the iPhone feel more modern, more refined, and more in line with the futuristic vision we all have for what smartphones should be.
And maybe that’s enough. Maybe pushing the boundaries of what’s possible, solving difficult engineering problems, and creating devices that feel slightly magical is worthwhile even when the practical benefits are subtle.
September 2026 is going to be interesting. Whether under-display Face ID lives up to the hype or has teething problems that need to be resolved in future generations, we’re watching the smartphone industry take another step toward that sci-fi future we’ve been promised for years.
And honestly? After nearly a decade of waiting, I’m ready to see if Apple can actually pull this off.

