US Congress targets Apple over iPhone slowing

Apple is set to face a grilling in the US Congress over allegations it slowed older phones to encourage consumers to purchase newer models.

The head of the Senate’s commerce committee, Senator John Thune, has written to Apple’s chief executive Tim Cook demanding an explanation for why Apple slowed down phones with flagging batteries, and why it would not provide free replacement batteries if older ones caused devices so much difficulty.

Mr Thune’s letter follows confirmation from French prosecutors that an investigation is being led by the finance ministry’s fraud control department after Apple admitted slowing down old devices with low-capacity batteries.

“Apple’s proposed solutions have prompted additional criticism from some customers, particularly its decision not to provide free replacement batteries,” Thune said, according to a report in the Wall Street Journal.

Apple revealed it had “downclocked” older models’ central processing units (CPUs) but said it did so to reduce the strain on dated batteries and stop the devices from unexpectedly shutting down.

Image: Apple recently released the iPhone X

At the time, the company asserted it would never “do anything to intentionally shorten the life of any Apple product, or degrade the user experience to drive customer upgrades”.

Amid a wave of class action lawsuits over the “deceptive, immoral, and unethical” phone slowing, Apple issued an apology and vowed to be more transparent with customers over the capacity of iPhone batteries.

In a post on its website, it said: “We know that some of you feel Apple has let you down. We apologise.”

The admission came after years of speculation from Apple customers that their older handsets were being slowed down in a bid to entice them into an upgrade.

The iPhone X’s notch is basically a Kinect

Sometimes it’s hard to tell exactly how fast technology is moving. “We put a man on the moon using the computing power of a handheld calculator,” as Richard Hendricks reminds us in Silicon Valley. In 2017, I use my pocket supercomputer of a phone to tweet with brands.

But Apple’s iPhone X provides a nice little illustration of how sensor and processing technology has evolved in the past decade. In June 2009, Microsoft unveiled the original Kinect.

In September 2017, Apple put all that tech into the front notch of the iPhone X.

Well, minus the tilt motor.

Microsoft’s original Kinect hardware was powered by a little-known Israeli company called PrimeSense. PrimeSense pioneered the technique of projecting a grid of infrared dots onto a scene, then detecting them with an IR camera and ascertaining depth information through a special processing chip.

The output of the Kinect was a 320 x 240 depth map with 2,048 levels of sensitivity (distinct depths), based on the 30,000-ish laser dots the IR projector blasted onto the scene in a proprietary speckle pattern.
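
For a sense of what software downstream of that chip actually consumes, here is a toy sketch – not PrimeSense’s pipeline, and with made-up camera intrinsics – of turning such a 320 x 240 depth map into a 3D point cloud with a standard pinhole back-projection:

```swift
import Foundation

// Toy illustration only: the real Kinect computes depth on PrimeSense's chip,
// but its output is conceptually a 320 x 240 grid of depth values (2,048
// levels fits in 11 bits). Given per-pixel depth in metres and *assumed*
// pinhole-camera intrinsics, back-projecting that grid into 3D points is
// straightforward.
struct Intrinsics {
    let fx: Double, fy: Double   // focal lengths in pixels (illustrative values)
    let cx: Double, cy: Double   // principal point (image centre)
}

func pointCloud(depth: [[Double]], intrinsics k: Intrinsics) -> [(Double, Double, Double)] {
    var points: [(Double, Double, Double)] = []
    for (v, row) in depth.enumerated() {
        for (u, z) in row.enumerated() where z > 0 {
            // Back-project pixel (u, v) at depth z metres into camera space.
            let x = (Double(u) - k.cx) * z / k.fx
            let y = (Double(v) - k.cy) * z / k.fy
            points.append((x, y, z))
        }
    }
    return points
}

// Example: a 320 x 240 frame filled with a flat two-metre depth reading.
let frame = Array(repeating: Array(repeating: 2.0, count: 320), count: 240)
let cloud = pointCloud(depth: frame, intrinsics: Intrinsics(fx: 285, fy: 285, cx: 160, cy: 120))
print("Reconstructed \(cloud.count) points")   // 76,800 points, one per pixel
```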

In its day, the Kinect was the fastest-selling consumer electronics device of all time, even as it was widely regarded as a flop for gaming. But the revolutionary depth-sensing tech ended up being a huge boost for robotics and machine vision.

In 2013, Apple bought PrimeSense. Depth cameras continued to evolve: Kinect 2.0 for the Xbox One replaced PrimeSense technology with Microsoft’s own tech and had much higher accuracy and resolution. It could recognize faces and even detect a player’s heart rate. Meanwhile, Intel also built its own depth sensor, Intel RealSense, and in 2015 worked with Microsoft to power Windows Hello. In 2016, Lenovo launched the Phab 2 Pro, the first phone to carry Google’s Tango technology for augmented reality and machine vision, which is also based on infrared depth detection.

And now, in late 2017, Apple is going to sell a phone with a front-facing depth camera. Unlike the original Kinect, which was built to track motion in a whole living room, the sensor is primarily designed for scanning faces and powers Apple’s Face ID feature. Apple’s “TrueDepth” camera blasts “more than 30,000 invisible dots” and can create incredibly detailed scans of a human face. In fact, while Apple’s Animoji feature is impressive, the developer API behind it is even wilder: Apple generates, in real time, a full animated 3D mesh of your face, while also approximating your face’s lighting conditions to improve the realism of AR applications.
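
Apple’s internal Face ID and Animoji pipelines aren’t public, but the developer API referred to above is ARKit’s face tracking on the TrueDepth camera. A minimal sketch, assuming an iPhone X-class device running iOS 11, might look like this:

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch of the developer-facing API: ARKit face tracking hands you
// an updating 3D mesh of the user's face plus an estimate of scene lighting.
final class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Provide the geometry ARKit should attach to the detected face.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let geometry = ARSCNFaceGeometry(device: device) else { return nil }
        return SCNNode(geometry: geometry)
    }

    // Called whenever ARKit refreshes the face anchor (i.e. your face moved).
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }

        // Update the rendered mesh from the latest vertex data.
        faceGeometry.update(from: faceAnchor.geometry)

        // ARKit also estimates lighting, which AR apps can use for realism.
        if let light = sceneView.session.currentFrame?.lightEstimate {
            print("Ambient intensity: \(light.ambientIntensity)")
        }
    }
}
```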

PrimeSense was never solely responsible for the technology in Microsoft’s Kinect — as evidenced by the huge improvements Microsoft made to Kinect 2.0 on its own — and it’s also obvious that Apple is doing plenty of new software and processing work on top of this hardware. But the basic idea of the Kinect is unchanged. And now it’s in a tiny notch on the front of a $999 iPhone.

A week ago, Apple debuted iPhone X and Face ID, a new biometric security mechanism that replaces Touch ID.

Face ID allows users to unlock their iPhone with their face. The same mechanism can also be used to make purchases in various Apple digital media stores, and to authenticate payments via Apple Pay.

The mechanism works by projecting over 30,000 infrared dots onto a face and creating a 3D mesh of it, then comparing it to the stored facial recognition information.

Security and privacy concerns addressed

The facial recognition information is stored on the device, in a Secure Enclave on the Apple A11 Bionic chip – it is never stored in the cloud, shared with third parties, or sent to Apple, meaning that Apple can’t hand over such information to law enforcement or anyone else.

And, as Craig Federighi, SVP of Software Engineering at Apple, confirmed for TechCrunch, the information is stored as a mathematical model that cannot be reverse-engineered, i.e. the information can’t be used to create a model of the face.

Apple is confident that the mechanism can’t be fooled by photos or masks. I have no doubt that many hackers will try to prove them wrong once they get their hands on a newly minted iPhone X, but it remains to be seen whether they’ll succeed.

How to disable Face ID

Apple has implemented easy methods for thwarting Face ID if the user is ever in a position of being forced to unlock the device without actually wanting to:

  • Simultaneously press and hold the buttons on either side of the device (volume and power) for a moment. This brings up the power-down screen, but it also disables Face ID.
  • Refuse to look directly at the iPhone. Face ID won’t work if the user doesn’t look at the device (unless the “attention detection” feature has been turned off – the option is provided to help people who are blind or vision impaired).

The fact that a user has to look at the phone for Face ID to work also means that people close to the user can’t unlock the device by putting it in front of the user’s face while he or she is sleeping.

Like Touch ID before it, Face ID will fall back to the passcode if there have been five failed recognition attempts, or if the device has been rebooted. The device will also ask for the passcode if the user hasn’t used Face ID in 48 hours.

For the moment, there is no option to require both Face ID and a passcode for added security.

So, if you’re less worried about security, Face ID is the more convenient choice. If the opposite is true, use passcodes. And if you’re not sure which option is the best, this post by security researcher Troy Hunt can help you decide.

This AI thinks Steve Jobs and Tim Cook have the same speechwriter

Apple has tried its best to channel Steve Jobs’ showmanship during its recent presentations, as seen during the company’s recent iPhone X launch. But perhaps it goes deeper than that: an AI speech analyzer thinks Steve Jobs and Tim Cook actually have the same speechwriter.

There’s plenty of reason to doubt the claim, but it’s an interesting finding nonetheless. The AI, called Emma, was created by the developers of Unicheck, a site for detecting plagiarism in college papers and such. Emma uses self-learning algorithms based on natural language processing to analyze a writer’s style, and the site’s press kit claims Emma has been tested to 85 percent accuracy.

It works simply enough. You feed the AI 5,000 words to analyze a writer’s style. Once that’s done, you only need to upload 200 different words to determine if they were written by the same person or not. Emma’s team plugged in Steve Jobs’ 2005 Stanford commencement speech and iPhone 4 keynote, and compared it to Tim Cook’s 2016 iPhone Event, his WWDC 2017 presentation, and this week’s iPhone X event.
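
Emma’s actual model is proprietary, so purely for illustration, here is a back-of-the-envelope version of the same workflow: build a profile from a known author’s text, then score a new sample against it using cosine similarity over word frequencies – a far cruder technique than whatever Emma uses.

```swift
import Foundation

// Toy illustration only: Emma reportedly uses self-learning NLP models.
// This sketch just compares normalized word-frequency vectors with cosine
// similarity to show the general shape of "profile, then compare".
func wordFrequencies(_ text: String) -> [String: Double] {
    let words = text.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
        .filter { !$0.isEmpty }
    var counts: [String: Double] = [:]
    for word in words { counts[word, default: 0] += 1 }
    return counts.mapValues { $0 / Double(words.count) }
}

func cosineSimilarity(_ a: [String: Double], _ b: [String: Double]) -> Double {
    var dot = 0.0, normA = 0.0, normB = 0.0
    for key in Set(a.keys).union(b.keys) {
        let x = a[key] ?? 0, y = b[key] ?? 0
        dot += x * y; normA += x * x; normB += y * y
    }
    return dot / (sqrt(normA) * sqrt(normB) + 1e-12)
}

// Placeholder strings: in practice you'd feed ~5,000 words for the profile
// and ~200 words for the sample, per Emma's stated requirements.
let referenceText = "paste roughly 5,000 words from the known author here"
let sampleText = "paste the roughly 200-word sample to be attributed here"
print("Style similarity:", cosineSimilarity(wordFrequencies(referenceText), wordFrequencies(sampleText)))
```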

For the first two, the AI was ‘100 percent’ sure they were written by the same person. For this week’s iPhone event, the AI was 86 percent sure.

That would be a pretty cool finding if true, but there’s reason to be skeptical. Jobs was known to have asked for help from screenwriter Aaron Sorkin for the Stanford speech, but Sorkin says he only “fixed a couple of typos.” Apple execs have definitely had speech writers in the past, but Jobs’ official biography suggests that he had always written his own presentations. Unless the biography is wrong, something changed starting with the Stanford speech, or Jobs has become a literal ghost writer, the pieces don’t quite add up.

More importantly, the AI just doesn’t get it right all the time. I plugged in 5,000 of my own words and compared them to TNW’s other writers, and the AI was correct around 80 percent of the time. That’s in line with the company’s accuracy claim, but notably, the AI always thought my colleague Abhimanyu Goshal’s pieces were my own – in one case with ‘100 percent’ certainty. Incidentally, Abhi and I tend to cover the most similar topics.

While my preferred explanation is that Siri is an ancient AI that has been writing Apple’s speeches all along, my guess is Apple has simply done its best to maintain Steve Jobs’ style and spirit. After all, Jobs’ keynotes had the power to define entire generations of technology; there’s a reason the Steve Jobs movie – written by Sorkin, by the way – is structured to take place during three product launches. Like they say – if it ain’t broke, don’t fix it.

Apple wants appmakers to avoid the notch when designing for iPhone X

During its annual special event yesterday, Apple lifted the lid on its all-new iPhone X – a handset it labels the future of smartphones. But it seems the future will be a little more of a hassle for developers and designers than it will be for consumers.

The iPhone X represents a significant departure from the iterative form factors we’ve come to expect from Apple, introducing an enhanced (almost) edgeless OLED display, new Face ID facial recognition technology and a rounded all-glass design. But perhaps its most eye-catching aspect currently remains the notch at the top of the screen.

It certainly was one of the highlights on the Twitterverse, attracting a pile of doubting eyes and cheeky wisecracks, mostly aimed at the awkward challenge it presents to appmakers.

The Big A has released its Human Interface Guidelines for building apps for the X – and, indeed, it appears appmakers will have a few more things to worry about when developing for the new flagship.

Unlike the 6, 7, and 8, which sport 4.7-inch panels, the X is not only a little taller (145 points, to be exact) but also has rounded corners. This means that if creators continue to follow the old standards when building apps for the new flagship, some areas of the screen might remain unused – or the notch might hide parts of the app interface.

With this in mind, the Cupertino giant instructs appmakers to “ensure that layouts fill the screen and aren’t obscured by the device’s rounded corners, sensor housing, or the indicator for accessing the Home screen.”

Among other things, this includes making sure that interactive controls don’t appear “at the very bottom of the screen and in corners,” that the interface doesn’t bring attention to the device’s rounded corners, and that the user interface isn’t clipped by the notch.

The good thing is that, according to official Apple footage (you will need Safari to open it), not every app will require custom tailoring for the X – and even if it does, the company’s new UIKit APIs and Xcode 9 should make this a fairly easy task.
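
For the common case, the mechanism Apple points developers toward is the safe area introduced in iOS 11: pin content to the view’s safe area layout guide and UIKit keeps it clear of the rounded corners, the sensor housing and the Home indicator. A minimal sketch (the view controller and color are placeholders):

```swift
import UIKit

// A minimal sketch of notch-aware layout: constrain content to the safe area
// layout guide (iOS 11) so it isn't clipped by the rounded corners, the
// sensor housing ("notch"), or the Home indicator.
final class NotchAwareViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let content = UIView()
        content.backgroundColor = .blue   // placeholder content
        content.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(content)

        let safeArea = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            content.topAnchor.constraint(equalTo: safeArea.topAnchor),
            content.leadingAnchor.constraint(equalTo: safeArea.leadingAnchor),
            content.trailingAnchor.constraint(equalTo: safeArea.trailingAnchor),
            content.bottomAnchor.constraint(equalTo: safeArea.bottomAnchor)
        ])
    }
}
```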

We already know mobile carriers are going to hate the notch; time will tell whether the same will be true of developers.

Meanwhile, those interested can peruse the full guidelines here.


Apple introduces ‘Face ID’ for iPhone X

Apple today confirmed the iPhone X will feature facial recognition technology that depends on a new “TrueDepth” camera system, comprising an infrared camera, flood illuminator, dot projector, and proximity sensor.

The Face ID feature will work in low light situations and, according to Apple, there’s only a one-in-a-million chance another face can unlock your device, unless you have an evil twin (according to Apple, if you do have an evil twin you should use passcode security).

Every time you look at your iPhone X, it detects your face. The feature will allow users to unlock devices simply by looking at their phones and swiping up, essentially letting iPhone owners use their face as a password.

The device features a built-in neural engine to perform the computational heavy lifting required to process users’ faces. This allows the iPhone X to learn your face and improve security over time. It will also enable the device to compensate for things like changing your hair color, growing a beard, or getting glasses.

The Face ID feature will work with Apple Pay, and has third party developer support as well. It’ll be interesting to see how the technology is used in non-Apple apps.
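
Apple hasn’t detailed that third-party story beyond the announcement, but the existing route for apps is the LocalAuthentication framework, which treats Face ID the same way it treats Touch ID. A minimal sketch (the function name and prompt string are placeholders):

```swift
import Foundation
import LocalAuthentication

// A minimal sketch of biometric unlock in a third-party app: LocalAuthentication
// handles Face ID on iPhone X and Touch ID on older devices with the same call.
func unlockSensitiveContent() {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        print("Biometrics unavailable:", error?.localizedDescription ?? "unknown error")
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evalError in
        DispatchQueue.main.async {
            if success {
                print("Authenticated with Face ID (or Touch ID on older devices)")
            } else {
                print("Authentication failed:", evalError?.localizedDescription ?? "unknown error")
            }
        }
    }
}
```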

Apple is about to piss off every mobile carrier with the iPhone X

Apple just unveiled its flashy new iPhone X at the keynote event in Cupertino, but there is one thing about the device that is likely to upset every mobile carrier out there: The carrier name will no longer be visible in the status bar.

While the name won’t be entirely gone, users will now have to swipe down to see which network they’re connected to. Admittedly, this change will likely only rub mobile carriers the wrong way, as users probably already know which carrier they’ve subscribed to.

The change was confirmed in a video demonstration shown during today’s iPhone keynote.

Here is how the status bar will look on the X: the carrier name no longer appears on the screen by default.

The possibility that the Cupertino titan might remove the carrier name from the status bar was first raised by serial Apple leaker Steve Troughton-Smith, who suggested this was a compromise the company made to render the device’s new notch design more appealing to users.

To check the mobile carrier, users will be able to pull down the left side of the screen. As Troughton-Smith explains, both “ear” areas of the notch design will be independently swipeable.


Good riddance, carrier names in the status bar – you won’t be missed by us, the consumers.

Apple yesterday announced Animoji, animated emoji that use the iPhone X’s new Face ID hardware to let users customize their own emoji.

Animoji uses the X’s TrueDepth camera system to track more than 50 facial motions in real time. You can pick from a dozen different emoji to customize, including the panda, unicorn, robot, and of course the poop emoji.
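
Apple hasn’t said Animoji is built on the public API, but the developer-facing counterpart of those 50-plus facial motions is ARKit’s blend shapes: a per-frame dictionary of named coefficients derived from the TrueDepth camera. A minimal sketch:

```swift
import ARKit

// A minimal sketch: read a few of the 50+ blend-shape coefficients (0.0–1.0)
// that ARKit reports for each face-tracking frame, the kind of signal an
// Animoji-style puppet would be driven by.
final class ExpressionTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let smileLeft = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        let browUp = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        print("jaw: \(jawOpen)  smile: \(smileLeft)  brow: \(browUp)")
    }
}

// Usage: run a face-tracking session and receive per-frame updates.
let session = ARSession()
let tracker = ExpressionTracker()
session.delegate = tracker
session.run(ARFaceTrackingConfiguration())
```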

When you record your Animoji, it will play your voice back along with the moving emoji — meaning we had the pleasure of watching Apple’s senior VP of software engineering Craig Federighi making animal noises on stage. They can also be used as stickers.

While some information about the Animoji leaked ahead of the show today, we didn’t know which emoji would be included until we were given a glimpse at the show. The Animoji will be available in iMessage as an app.


Live Apple Event – Apple September Event 2017 – iPhone 8, iPhone X, iOS 11 – Apple Keynote
Apple’s new iPhone 8 will be officially unveiled today. The latest smartphone is expected to be a radical change for the company, which will seek to wow users for the 10th anniversary of the device.

Apple’s iPhone X event: How to watch live

Apple is about to kick off its most important event in years.

The company is expected to announce new iPhones — including a high-end device, potentially called the iPhone X — plus a new Apple Watch and Apple TV device, and maybe some more information about its HomePod speakers. It’s the first-ever press event at the company’s new Steve Jobs Theater, part of its recently opened Apple Park “spaceship” campus in Cupertino, Calif.

The keynote kicks off today, Tuesday, Sept. 12 at 10 am PT, 1 pm ET. (That’s 6 pm in London and 1 am Wednesday in Hong Kong.) Recode will be in attendance, and will cover the event — click here for the latest. Other live coverage options include The Verge and Six Colors.

Apple plans to livestream the event on its website and via its devices, but you’ll need to use one that’s compatible with its livestreaming technology. These include: “iPhone, iPad, or iPod touch with Safari on iOS 9.0 or later, a Mac with Safari on macOS v10.11 or later, or a PC with Microsoft Edge on Windows 10.” You can also use an Apple TV streaming device, provided it’s a second- or third-generation device with software 6.2 or later or the latest, fourth-generation Apple TV.