Built like tanks in designer suits. There is a reason iPhones tend to last longer than most Android phones. It is a similar story for MacBooks compared with Windows laptops (I can attest to this with a first-generation 2012 MacBook Pro Retina that still works perfectly), and for iPads compared with any convertibles running Android or Windows. I’ll get to the why in a moment, but first a snapshot of the landscape. Lower failure rates have been reported for iPhones over the years, though certain Android phones (Samsung and Google being the most prominent) have closed the gap on reliability. Multiple studies, and experiences relayed by users worldwide over the years, point to that trend.
Whilst I was in Cupertino for the WWDC 2025 keynote and the briefings that followed, Apple gave us access to its otherwise secretive Durability Labs: a rare glimpse of the ruggedness tests that every Apple product undergoes, whether iPhone, iPad, Mac or even batteries. I can talk about some of it, while a lot remains under wraps, but the definitive theme is real-world usage being replicated, from freezing environments to hot, desert-like locations. Those fine differences may mean everything in the cutting-edge evolution we see in the tech space. Every test, even the repeated ones, felt like a dagger through the heart of someone who absolutely chastises folks for keeping the adapter atop the lid of a closed laptop!
- The three tests we can talk about (I have some handy content on Instagram) include the IPX Chamber, the Random Drop and the Vibration Table. The IPX Chamber, visually very interesting, simulates rain or water-spray conditions to assess a product’s water resistance and whether it meets specific water-protection ratings. A jet of water doesn’t bother an iPhone, and neither does immersion. That is always good to know, though I’d still not dunk one in a pool.
- The Random Drop test simulates the variety of randomised, accidental drops a product may experience in everyday use. These are often at an angle, and Apple’s array of high-resolution cameras gives us a rare glimpse into how a MacBook flexes when it falls, or how an iPhone bounces after impact on a hard surface. Those amongst us who tend to be clumsy, whether out of absentmindedness or natural gift, will have already derived benefits from this test.
- This is similar to the world of automobiles, where tests are designed to shake a vehicle to see how well the build quality holds up. In Apple’s version, the Vibration Table simulates the series of shaking environments and impacts that a MacBook or iPad, for instance, may have to absorb during transit or other real-world usage.
From the tests that I can show you, and from those that remain a secret, it is clear Apple isn’t approaching this array from a standpoint of pure data. Using an iPhone or iPad in the high-humidity locales of South East Asia, dropping one on a marble floor or the pavement as you rush through a workday, or absentmindedly letting an iPhone slip into a pool of water: these are all very real-world incidents that Apple hopes to build products capable of withstanding. And they are. Apple’s hardware may look incredibly sleek, but the tests it is subjected to are anything but. Which is perhaps why the iPhone, iPad or Mac that you use is more rugged than its contours may initially suggest. Turns out, I may not be straining the metaphor when I say Apple’s durability labs are a crucible that forges resilience into every device.
FIREFLY, ON THE FLY
Having covered the evolution of Adobe’s Firefly AI consistently over time, I had a persistent feeling that the company was missing a trick by not packaging all that AI in a simple-to-use Firefly app, particularly at a time when the likes of OpenAI and Google were doing more than their bit for AI-enabled content creation. That has finally changed, with the Firefly app now available for iPhones as well as Android devices. The tools on offer make for a rather interesting cocktail: turning a text prompt into a video clip of up to 5 seconds in 1080p resolution, text-to-image generation, and the generative fill and generative expand tools that had already found a place in Adobe’s popular creative apps, including Photoshop.
“Our goal with Firefly is to deliver creators the most comprehensive destination on web and mobile to access the best generative models from across the industry, in a single integrated experience from ideation to generation and editing,” says Ely Greenfield, Adobe’s Senior Vice President and Chief Technology Officer. If you are thinking Adobe’s simply pushing a case for its own Firefly models (of course it is) and not much else, you’d be very, very wrong.
A quick journey through the new Firefly app makes it clear that users have a wide choice of models. There’s Adobe’s own Firefly Image 4 Ultra (likely exclusive to paying subscribers), Firefly Image 4 and Firefly Image 3. And then there are the third-party models, including Google’s Imagen 4, Imagen 3 and Veo 2, as well as OpenAI’s GPT Image. Speaking of subscriptions for a moment: it’ll be ₹499 per month or ₹4,999 per year for 750 monthly credits, unlocking 7-second video generations and 100GB of storage. My belief is that this is separate from the Creative Cloud subscription. However, generated content syncs via the Creative Cloud account, which provides a link with apps you may already be paying for, including Lightroom and Photoshop.
The Firefly app also integrates seamlessly with Adobe Express, which makes it immediately available and relevant for a wide variety of usage scenarios: essentially everything Express can do with its consumer-focused approach, such as image editing, resizing assets, managing brand assets and video editing.
There’s more: Adobe confirms that its expanded Firefly generative AI ecosystem will integrate models from Ideogram, Luma AI, Pika and Runway, alongside existing models from OpenAI, Google and Black Forest Labs. This means creators will get a much broader choice of generative styles, aesthetics and technical specifics such as formats and resolutions. The company confirms that models from these new partners are launching first in Firefly Boards, a product for teams, and will soon be available within the Firefly app.
Charting Adobe Firefly’s progression…
VOICE OF REASON: AI AND INTERNS
This is interesting. Artificial intelligence companies never seem to tire of telling us that AI is already almost everything it needs to be, and will only get better. AI agents. Artificial general intelligence. Thinking models. Reasoning models. And so on. Then, like a bolt from the blue, comes OpenAI CEO Sam Altman’s statement at the Snowflake Summit a week ago: “Today, AI is like an intern that can work for a couple of hours, but at some point it’ll be like an experienced software engineer that can work for a couple of days.” A dose of realism and, as always, a generous dollop of optimism.
The big question this statement raises is: have corporations and businesses around the world been enticed to replace (perceived to be costlier) human employees with (perceived to be more cost-effective) AI models that are, at best, limited in actual capability? And once a business is a customer, AI companies would hope it sticks around even as AI gets better and the price of AI subscriptions goes up in the times to come.