After many years of designing and selling a variety of cables to power and charge its devices, Apple has slowly switched to USB-C chargers for all of its products.
The last device to make the swap is the iPhone, and the change happened against Apple’s will. In October last year, the European Commission requested that all phone and laptop producers switch to the USB-C connector (which had earlier been agreed on as a common standard).
Apple could have chosen to ignore the request and stop selling in the EU, or to produce versions with USB-C for the European single market only. Instead, it chose to comply and follow the EU rules everywhere. The common charger for all devices is thus becoming a reality, at least until the world moves completely to wireless charging.
A better standard
The Lightning charger was introduced by Apple in 2012 and first featured on the iPhone 5. It was the successor to the 30-pin dock connector, introduced in 2003 for iPods and later used on the first iPhones. Arguably, the key visible innovation of the Lightning cable was its reversible ends.
This enabled the user to insert the charger into the dock without having to wonder whether it was oriented the right way. It might seem trivial now, but at the time no other charger offered it. If you use a standard USB-A port on your laptop, you probably still spend time plugging the cable in and pulling it out to find the right orientation. You’re probably also complaining about how inconvenient it is. At least, that’s what I do.
The USB-C connector came out about two years after the Lightning. There was nothing particularly novel about it compared with Apple’s cable, but it did borrow the Lightning connector’s reversibility.
USB-C is just the name of the connector, not the entire cable. The connector has its own technical specification, and the latest data standard that runs over it, USB4, outperforms Lightning in every technical dimension conceivable. It can transfer data far more quickly: up to 40Gbps (gigabits per second), against 480Mbps (megabits per second) for Lightning. It also charges devices more swiftly, to the point that Apple started selling USB-C to Lightning cables so its own phones could fast-charge from USB-C power adapters.
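To put those headline rates in perspective, here is a minimal back-of-the-envelope sketch in Python. The 40Gbps and 480Mbps figures are the ones quoted above; the 5GB file size is an assumed example, and the results are theoretical peaks that ignore protocol overheads and real-world device limits.

```python
# Back-of-the-envelope comparison of the headline data rates quoted above.
# 40Gbps (USB4 over USB-C) and 480Mbps (Lightning) come from the text;
# the 5GB file size is an illustrative assumption, not a benchmark.

FILE_SIZE_GB = 5
FILE_SIZE_BITS = FILE_SIZE_GB * 8e9  # one gigabyte is roughly 8 billion bits

for name, rate_bps in [("USB4 over USB-C", 40e9), ("Lightning", 480e6)]:
    seconds = FILE_SIZE_BITS / rate_bps
    print(f"{name}: about {seconds:.0f} seconds for a {FILE_SIZE_GB}GB file at peak rate")

# Roughly 1 second over USB4 versus around 83 seconds over Lightning.
```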
The main difference between the two, however, is that USB-C is not proprietary. It was developed by a consortium called the USB Implementers Forum, composed of companies such as Intel and Microsoft, and also Apple.
All of the USB standards can be used by any business. Apple, on the other hand, does not allow anyone else to use its proprietary accessories unless they agree to a licence. This means that USB-C is compatible with many more devices, including all recent Apple products, but, until now, not the iPhone.
When it pays to be different
So, what’s so special about the Lightning connection that made Apple stick with it for so long, despite repeated promises to join its competitors on a common standard? Why would Apple sabotage one of its own phones by keeping a substandard charging connection?
One possibility is that consumers are inattentive when they buy a phone, and do not directly factor in the cost of accessories such as chargers. If this is true, Apple had an incentive to keep these add-ons proprietary, making sure no competitor could start offering them for a lower price. If so, forcing Apple to offer the better standard benefits all consumers.
The alternative explanation is that some consumers actually value the Lightning connection more. After all, the look is different, and Apple fans argue that it may be harder-wearing than other standards. It is also a signal of status and exclusivity.
We seem to have reached a stage in the market for smartphones where people who only care about everyday usage replace their device much less often. This is probably because technology is not evolving at the same pace it did in the past. Yet demand for high-end phones continues to increase.
This could be because they cater to a subset of consumers who greatly value either a marginally better camera or slightly bigger storage. But mostly, expensive phones are a way to signal social status.
People buy the latest phone not only because they want to own it, but also to be seen as owning one. This is certainly a factor that has helped Apple thrive because the company offers products that are visibly different from the cheaper alternative. And another sign of status for Apple users is having different accessories, including the proprietary chargers.
Apple has not always been so keen to reject common standards. Not only is it one of the participants in the USB consortium, but it is also the company that helped USB become the global standard by offering it on its first generation of iMacs.
At the time, however, Apple was the underdog in the market for personal computers, facing off against the tech giant Microsoft. And a big reason why many people did not buy Apple computers at the time was their fear they would not be compatible with Microsoft products.
At one point, Apple even went as far as developing tools to help users run Windows on their devices. At the time, it made sense to try to make your products as compatible as possible with those of the market leader.
In today’s smartphone market, Apple is a leader, and may gain from not being compatible with other standards and products. The big question, however, is whether consumers benefit. If exclusivity is a way to block competition, then they probably don’t. If consumers value exclusivity, or if it encourages Apple to innovate, then perhaps forced standardisation is not such a great idea.