Linking to Business Insider’s coverage of this story.
Vic Gundotra, former Google SVP of Engineering, posted the following on Facebook yesterday in reference to recent shots taken of his family with an iPhone 7 Plus:
The end of the DSLR for most people has already arrived. I left my professional camera at home and took these shots at dinner with my iPhone 7 using computational photography (portrait mode as Apple calls it). Hard not to call these results (in a restaurant, taken on a mobile phone with no flash) stunning. Great job Apple.
In a later comment, he goes on to explain why other phones trail behind. Here it is in its entirety, because it’s fantastic:
Here is the problem: It’s Android. Android is an open source (mostly) operating system that has to be neutral to all parties. This sounds good until you get into the details. Ever wonder why a Samsung phone has a confused and bewildering array of photo options? Should I use the Samsung Camera? Or the Android Camera? Samsung gallery or Google Photos?
It’s because when Samsung innovates with the underlying hardware (like a better camera) they have to convince Google to allow that innovation to be surfaced to other applications via the appropriate API. That can take YEARS.
Also the greatest innovation isn’t even happening at the hardware level - it’s happening at the computational photography level. (Google was crushing this 5 years ago - they had had “auto awesome” that used AI techniques to automatically remove wrinkles, whiten teeth, add vignetting, etc… but recently Google has fallen back).
Apple doesn’t have all these constraints. They innovate in the underlying hardware, and just simply update the software with their latest innovations (like portrait mode) and ship it.
Bottom line: If you truly care about great photography, you own an iPhone. If you don’t mind being a few years behind, buy an Android.
Damning words by Gundotra. If you have ever scoffed when Tim Cook says “this is something only Apple can do”, remember this post. It all goes back to owning as much of the technology stack as possible (hardware and software). As Gundotra points out, Apple has virtually no limitations when it comes to innovating because of this. Also for good measure, and because it’s so true, here’s Alan Kay’s legendary quote: “People who are really serious about software should make their own hardware.” Google has just begun to do this with their Pixel phone line, but they’ve got a long way to go to execute at the level Apple does.
Horace Dediu for Asymco:
The graph shows a high degree of consistency of pattern: Every year a new iPhone is launched which replaces the one launched the year before. The older product is still offered at a reduced price. Price brackets are very firm and set at fixed intervals about $100 apart.
The “floor” of the range is a consistent $400 while the “ceiling” has expanded from $700 to about $950.
This year’s ceiling is due for the fourth leg up and if the pattern persists, we should expect it to reach $1100.
Definitely check out the whole post. It’s full of excellent graphs and data-driven logic. The price lines up with Gruber’s thought process as well. With growing analysis, it seems inevitable that the iPhone 8 will be the most expensive iPhone when it launches. The only thing nobody can decide on is what it will actually be called.
The iPod Touch is now the only remaining iPod sold by Apple, after the Nano and Shuffle were unceremoniously discontinued yesterday. It truly is the end of an era for the longtime king of MP3 players.
My first iPod (3rd generation) 1 was a lot of firsts for me. It was my first MP3 player, first Apple device, and first gadget I ever lusted after. I only had it because my grandfather promised to buy me something when I made the honor roll in high school. I remember not knowing what I wanted at the time, until suddenly it hit me: iPod.
Even then, I didn’t know why I felt so compelled to have one. I was still developing my own taste in music and didn’t even have a CD player, let alone an inferior MP3 player. Still, the third generation iPod was such a cool, mesmerizing gadget that it almost didn’t matter if I had songs to load on it or not — I just wanted the damn thing. Thankfully, my Dad is a music aficionado and had a budding MP3 collection at the time, so I wasn’t too worried about that.
I remember scouring the Apple website and the whole internet for every detail I could find about the iPod. 2 I was obsessed with how freaking cool it was with its red, glowing buttons and static touch wheel. I remember vividly going to the Apple site a million times from the school library to play with how I wanted the back engraved. Remember how novel that was?
Once I got it, I remember being the first person on campus with an iPod, which drew the attention of my friends, peers, and teachers. Some were even worried it would get stolen, but iPods started becoming commonplace not too long after.
I remember sitting in English class with my then best friend, sharing the old Apple earbuds while listening to Queen’s Another One Bites The Dust and Under Pressure. 3
I remember purchasing the fourth generation iPod while on vacation in Palm Desert shortly after it came out. I called the nearest Best Buy and had my Dad drive me over to buy it. 4 I was in awe of the new click wheel and blue-tinged screen.
I remember the fifth generation iPod, which featured a color screen and the ability to play videos. I didn’t play many videos on it, but that color screen was so impressive at the time.
I remember the iPod Classic, which was the epitome of the iPod’s great design. I also remember when it was killed off in 2014.
I remember the original iPod nano, which was just absurdly small. Loved that little guy. I think the fourth generation nano was the best design, though. The tall screen and click wheel were just SO perfect.
I remember carrying an iPod Touch and original iPhone at the same time because I didn’t want to kill my iPhone’s battery by playing music.
Shortly after, I remember the first time I no longer needed a standalone iPod (when the iPhone 3GS came out). Just like that, I cast away an old friend and adopted its true successor. Now, ten years after the introduction of the iPhone, the last remaining real iPods are no more. They paved the way for so many things: for the modern Apple of course, but also for computing, for music, for PC to Mac switchers 5, and for the world.
And now, I remember the plethora of Apple products I’ve bought over the years thanks to the iPod hooking me into the Apple ecosystem.
So thank you for everything, iPod. Your name lives on in other Apple music products (i.e. HomePod, AirPods), and your legacy will live on inside the iPhone’s Music app.
Christina Passariello from the Wall Street Journal has a great piece centering on Jony Ive and Apple’s new headquarters, Apple Park.
You definitely want to settle in for the long read linked above, but here are a few highlights.
Throughout the article, Ive compares the planning, design, and creation of Apple Park to that of any other Apple product:
Ive’s characteristically understated reaction—“It’s nice, though, isn’t it?”—masks the anxiety he feels each time a product he’s designed is about to be introduced to the world. “There’s the same rather strange process you go through when you finish a product and you prepare to release it—it’s the same set of feelings,” says Ive, who turned 50 in February. “That feels, I don’t know, encouragingly healthy, because I would be concerned if we lost that sense of anxiety. I think that would suggest that we were not as self-critical, not as curious, not as inquisitive as we have to be to be able to be effective and do good work.”
On Apple Park being the workplace for future Apple employees:
[…] At the same time, he promises it will be the birthplace of new toys and tools the rest of us haven’t imagined yet. Ive and Tim Cook, Apple’s chief executive, talk about the campus as something for the next generation of Apple employees—like parents doing estate planning.
This is really well said. Tim, Jony, and other execs are clearly looking toward the future when it comes to Apple Park. Our children will build the next great Apple products in this facility.
Christina goes on to suggest that Jony essentially needs to prove Apple hasn’t stagnated:
[…] In other technologies, from digital assistants to driverless vehicles to augmented and virtual reality, Apple seems to lag other tech giants, including Google, Amazon and Tesla. Its new voice-activated speaker, HomePod, unveiled in June, will arrive on the market in December, three years after Amazon’s Echo. […]
Cue the tired “Apple is behind everyone” trope, with a surprise appearance from Tesla for some reason.1 Also, I would argue HomePod is a tangential competitor to the Echo, and a direct competitor to Sonos.
Tim Cook on managing Apple’s growth:
“We didn’t plan our growth, and then when we saw our growth, we were so engrossed in trying to push things forward that we didn’t spend time to really develop the workplace,” says Cook. “We’ve done a really good job of working around it, but it’s not the way we want to be working, nor does it represent our culture well.”
I can relate to this. Working for an extremely large national company with specialty teams can make for difficult collaboration at times.
A cool note about how the AirPods design was inspired by Stormtroopers:
When J.J. Abrams was working on Star Wars: The Force Awakens, Ive mentioned that he “would love to see a lightsaber that is rougher, spitting sparks,” Abrams says. The director, who says he and Ive were already fans of each other’s work when they met at a dinner four years ago, applied Ive’s suggestion to character Kylo Ren’s weapon. “His lightsaber was as imperfect and unpredictable as the character,” says Abrams. (The inspiration is mutual: Ive told Abrams that he had the look of the original Stormtroopers in mind when he designed Apple’s earbuds.)
More on architecture as a product from Ive:
Architecture is “a sort of product design; you can talk about it in terms of scale and function and materials, material types,” he says. “I think the delineation is a much, much softer set of boundaries that mark our expertise.”
Marc Newson on design and a token “Apple Car” reference:
“We always joked that one of the greatest sources of our inspiration was the fact that there was just so much stuff out there that we didn’t like,” says Newson. “The negativity sort of became a positive source of inspiration.” Newson says that Ive’s hand could improve a plethora of badly designed products beyond technology, such as cars—though he says he has no idea if Apple is working on a car.
Apple Park was almost shaped like a fidget spinner:
The desire for light and air, crossed with the need for enough density to house 12,000 employees, gave shape to Apple Park’s main building. Ive, tracing an infinity sign in the air, says they considered complex forms, including a trilobal design, a sort of giant fidget spinner. Ultimately they decided that only a ring shape could give the feeling of being close to the elements.
On the design of work pods:
The first prototype was ready in the summer of 2010, with pictures of trees on either end of the central area to evoke the landscaping and proximity to the outdoors. Jobs himself set the precise dimensions of the openings from one end of the central area to the other. The team quickly discovered that early versions of the small offices on each side of the central area were noisy—sound bounced off the flat wood walls. Foster’s architects suggested perforating the walls with millions of tiny holes and lining them with an absorbent material. In the completed section of workspace, Ive snaps his fingers to demonstrate the warm sound it creates.
I love this quote from Laurene Powell Jobs, because it sounds exactly like something Steve would say:
“The materiality of it is inspiring,” says Powell Jobs. “The quality of the wood, the quality of the stone, the quality of the light—that’s what makes it so beautiful.”
On Apple employees paying for their own food, as opposed to most other large tech organizations which offer it for free:
Apple employees will pay for the food served here, but at a somewhat subsidized rate. “Steve’s philosophy was that when people have skin in the game, they appreciate it more,” says Dan Whisenhunt, Apple’s head of real estate and development.
It’s an interesting philosophy I can see the point of. It’s not like you’re going to choose between Apple and Google solely because of free food, for instance. You would choose Apple because you believe in the culture, mission, and passion.
On the importance of employees being physically together:
Ive and Cook place great importance on employees being physically together at work—ironic for a company that has created devices that enable people to work from a distance. Face-to-face communication is essential during the beginning of a project, when an idea is sprouting, they say. Once a model emerges from a series of conversations, it draws people in and gives focus. “For all of the beauty of technology and all the things we’ve helped facilitate over the years, nothing yet replaces human interaction,” says Cook, “and I don’t think it will ever happen.”
Again, I can relate to this professionally. While technology allows us to work from anywhere, meeting in person or by random happenstance almost always makes for better progress.
On the workspaces being more open and less confined:
The thousands of employees at Apple Park will need to bend slightly to Ive’s vision of the workplace. Many will be seated in open space, not the small offices they’re used to. Coders and programmers are concerned that their work surroundings will be too noisy and distracting. Whiteboards—synonymous with Silicon Valley brainstorming—are built into floor-to-ceiling sliding doors in the central area of each pod, but “some of the engineers are freaking out” that it isn’t enough, says Whisenhunt. iPhones will be the primary mode of communication for everyone, though individuals can also lobby for a desk phone, if they feel they have a need for one.
Going from an office to open space will probably be a little shocking, and some might not be able to adjust, but adaptation will be key.
On Apple supposedly contributing to a tree shortage:
Ive takes offense at the idea that he hasn’t already thought of every detail during the years of planning Apple Park. He scoffs at an article claiming that Apple contributed to a tree shortage in the Bay Area by buying up so many plants for the campus, “as if we’d got to the end of our project and we thought, Oh, we’d better plant some trees.” Apple began working with an arborist years ago to source trees, including varieties that once made up the bountiful orchards of Silicon Valley; more than 9,000, many of them drought-resistant, will have been planted by the time the campus is finished.
On Ive getting back to normal work after Apple Park is fully up and running:
In the next few months, Ive will transition from being the creator of Apple Park to one of its thousands of users. His design team is scheduled to be one of the last to move into the new headquarters this fall—around the same time as the event at which Apple has typically unveiled its new iPhone. The next frontier Ive faces, beyond reinventing a greatest hit, is how to further embed technology onto our bodies and into our homes, using devices such as the Apple Watch, AirPods and HomePods as the beachheads for collecting data and tracking ourselves. “Everything we design and make in the future is going to start right here,” he says.
With each new product Apple rolls out, its predecessors seem a little antiquated. But Ive and Jobs built Apple Park to last, and their legacy will be etched into the glass, concrete and trees for decades to come. Just as the ring blurs the boundary between inside and outside, Ive’s personal and professional lives are fluid. As a designer, “you spend so much time living in or living with the solution that doesn’t yet exist,” he says. “I’m just looking forward to going to see an engineer I’m working with on something, to sit there and perhaps walk out and sit outside for a bit with him, to be able to go to the workshop and start to see how we’re building something.”
Ive’s longevity at Apple has been questioned a bit in the past couple of years, following his promotion to Chief Design Officer. The promotion allowed him to shed his managerial duties through the appointment of Alan Dye and Richard Howarth as VP of User Interface Design and VP of Industrial Design, respectively. It was also speculated that he wanted to return to the U.K. in a larger capacity.
Ive is still a large part of Apple, and judging by the last two paragraphs, it sounds like he has a renewed focus to get back to his other design work, thanks to the new campus.
If Christina means a car, then it goes without saying that Apple is behind the entire automotive industry, not just Tesla. ↩︎
We’re gradually increasing our reliance on smart assistants, but they are far from perfect. Going hand-in-hand with them is the next mainstream computing input method: voice. Sure, voice control has been around for a while, but we’re turning the corner on it being used in extremely meaningful ways throughout the course of our daily lives.
As a big proponent of voice input and smart assistants, here are a couple of improvements that would be the next step in the right direction for the interaction experience.
Picture this: your little one just fell asleep, and you go to turn on the nightlight in the room with your Amazon Echo like you always do. It goes a little something like this.
You: Alexa, turn on the nightlight — oh shit…
Alexa at full volume: OKAY!!!
Now you have to coerce your little one back to sleep. This can apply to using Siri on the iPhone or iPad, too. Sometimes I want to set the Good Night scene using Siri on my phone, but Siri’s volume is set differently from the system volume, so I’d rather not chance what it was last set to.
These assistants need to find a way to adapt their volume to the situation, based on multiple factors. If it’s late at night and quiet, it’s probably safe to say I don’t want loud feedback from Alexa, Siri, or the like. Maybe a volume level of 3-4 is fine, but definitely nothing louder.
Conversely, if there’s a lot of noise in the room, bump that volume up so I can hear the response. All of these devices have multiple microphones built in, so it’s just a matter of software.
In short: don’t take my manual volume change as law if it doesn’t make sense for the situation. This is an instance where a computer should be allowed to decide something for us.
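To sketch the idea in code (purely illustrative: the function, thresholds, and decibel values are all made up, not any vendor’s actual logic):

```python
from datetime import time

def response_volume(ambient_db: float, now: time, user_volume: int) -> int:
    """Pick a response volume (0-10) from ambient noise and time of day,
    overriding the user's last manual setting only when it clearly doesn't fit."""
    late_night = now >= time(22, 0) or now <= time(6, 0)
    if late_night and ambient_db < 40:   # quiet house at night: whisper
        return min(user_volume, 3)
    if ambient_db > 70:                  # loud room: make sure we're heard
        return max(user_volume, 8)
    return user_volume                   # otherwise, respect the manual setting

print(response_volume(35, time(23, 30), 7))  # quiet night: 3
print(response_volume(75, time(18, 0), 4))   # noisy room: 8
```

The point is the last return: in ordinary conditions the manual setting still wins; the assistant only overrides it at the extremes.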
Give us a volume request modifier. Two examples:
You: Alexa, quietly turn on the nightlight.
Alexa changes to low volume: “Okay.”
Alexa then reverts to the original volume.
You: Alexa, loudly, what time is it in New York?
Alexa changes to full volume: “THE TIME IN NEW YORK IS 11AM!!!”
Alexa then reverts to the original volume.
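A naive sketch of how an assistant could handle such a modifier (the modifier words and the volume levels mapped to them are hypothetical):

```python
def parse_volume_modifier(utterance: str):
    """Strip a leading volume modifier from a command.
    Returns (temporary_volume_or_None, remaining_command)."""
    modifiers = {"quietly": 2, "loudly": 10}  # hypothetical volume levels
    words = utterance.split()
    # Allow the modifier right after the wake word, e.g. "Alexa, quietly ..."
    for i, word in enumerate(words[:2]):
        key = word.strip(",.").lower()
        if key in modifiers:
            rest = " ".join(words[:i] + words[i + 1:])
            return modifiers[key], rest
    return None, utterance

print(parse_volume_modifier("Alexa, quietly turn on the nightlight"))
# (2, 'Alexa, turn on the nightlight')
```

The assistant would respond at the temporary volume, then revert to whatever was set before.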
Pretty straightforward. Let us string at least two commands together for controlling smart home devices. Perhaps I want to selectively control two devices at a time with Siri that aren’t part of a scene I’ve already configured. For example:
Hey Siri, turn off the foyer and living room lights.
Hey Siri, unlock the door and turn on the porch light.
This would be a huge step in improving the manual control experience of smart home devices, instead of one singular command at a time.
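The simplest possible sketch of this is splitting the utterance on connectives; a real assistant would need genuine language understanding, since a naive split mangles a shared verb like “turn off the foyer and living room lights”:

```python
import re

def split_commands(utterance: str) -> list[str]:
    """Naively split a compound smart home request into individual commands."""
    parts = re.split(r"\s*(?:,|\band\b)\s*", utterance)
    return [p for p in parts if p]

print(split_commands("unlock the door and turn on the porch light"))
# ['unlock the door', 'turn on the porch light']
```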
David Pierce from Wired met with Andy Rubin to discuss Essential’s plan to unite the smart home market. There’s a lot of great stuff in the article, but here are a few notable quotes.
Rubin doesn’t employ human security guards. He doesn’t think he needs them. The 54-year-old tech visionary (who, among other things, coinvented Android) is pretty sure he has the world’s smartest house. The homebrew security net is only the beginning: There’s also a heating and ventilating system that takes excess heat from various rooms and automatically routes it into cooler areas. He has a wireless music system, a Crestron custom-install home automation system, and an automatic cleaner for his pool.
Getting the whole place up and running took Rubin a decade. And don’t even ask him what it cost. There’s an entire room full of things he bought, tried, and shelved, but the part that really drove him crazy was that it didn’t seem like automating his home ought to be this hard. Take the license-plate camera, for instance: Computer-vision software that can read a tag is readily available. Outdoor cameras are cheap and easy to find, as are infrared illuminators that let those cameras see in the dark. Self-opening gates are everywhere. All the pieces were available, but “they were all by different companies,” Rubin says. “And there was no UI. It’s not turnkey.”
This is indeed the problem with the consumer smart home market, as well. There are so many options and not everything works together easily.
If the market continues this way, you’ll be forced to buy all the way into one company’s vision, essentially ripping your house to the studs and replacing everything with Samsung-approved sensors to work with your Galaxy phone, or gadgets from the Apple Store to work with your iPhone. Otherwise, there’s a good chance your lights won’t work with your music system, and the front door and television won’t be on speaking terms. Rubin makes one point over and over throughout our conversations: If the way people interact with their connected home is through smartphone apps, the connected home will never go anywhere. If you have to open an app, log in, and tap around just to open your front door, only to open, log in, and tap around another to turn up the thermostat, nobody will do either.
I’m not sure if buying into one company’s vision is necessarily a bad thing. With it comes the assurance that said system is going to work as it is engineered to, and hypothetically better than a hodgepodge of different technologies forced together. For instance, right now, IFTTT is a popular service for automating systems that don’t inherently talk to each other. It works, but it isn’t perfect, and it requires additional setup.
I agree that using multiple smartphone apps to control your devices is extremely cumbersome. Voice will rule this space when it comes to interface control as Alexa and Siri improve over time, but the real killer app will be automation—when you don’t have to manually control your smart home.
And that’s where Essential’s most important product comes in. Ambient OS—Rubin describes it as “Android, but evolved”—is a universal translator for the smart home, combining all the major smart home products and platforms into a single elegant system and interface. That’s how it will look to users, anyway. Behind the scenes it’s just an elaborate hack. “I plug into SmartThings, I plug into HomeKit, I plug into Thread and Weave, and I get a hundred thousand devices that I can control with my UI,” Rubin says. In the background, Ambient’s job is to strip the barriers between devices so users don’t have to worry about compatibility. They should buy a light bulb, screw it in, and trust that it’ll turn on when it’s told to. And the code that runs the operating system will be publicly available, so outside developers can create new stuff that works with it seamlessly.
On paper, this sounds like a great idea — one device to rule them all. On the other hand, as Pierce says, it’s “an elaborate hack”. Furthermore, while Amazon is happy to license out the Alexa voice service, I wouldn’t imagine Apple being OK with a competing smart speaker hooking into HomeKit, and definitely not Siri.
The team wanted Essential’s first products to be unique, sleek curios that would seem exclusive and exciting in a sea of identical aluminum rectangles, and offered manufacturers a chance to show off their best work at a more achievable scale.
The Essential Phone uses titanium, ceramic, and has an edge-to-edge screen, but it still looks a lot like an aluminum rectangle. Actual reviews are yet to be seen, as it missed its initial July delivery window, but is supposedly coming ‘in a few weeks’.
The Essential Home admittedly looks much more unique than an Echo or Google Home.
In the team’s imagination, once your home has been properly kitted out with connected devices, there’s no controller there at all. You don’t say a wake word or turn on a screen or enter a password. You definitely won’t have to get your phone out just to turn on the lights. You just declare your needs, in whatever way makes sense at the moment—voice command, touchscreen—and they’re taken care of.
I like the idea of not using a wake word one day. Having to preface everything with “Alexa”, “Hey Siri”, or “OK, Google” can get tedious, especially if you need to string a few commands together in succession.
Long-term, Rubin is banking on machine learning to make technology far more useful and intuitive. In many cases, you won’t have to touch or speak to your devices at all. That’s the full promise of ubiquitous computing: Everything just works. It’ll know what you want because it watches you and learns that it should start warming up the car when you’re putting your shoes on, because that’s always the last thing you do in the morning. When you say, “Tell Anna it’s time for dinner,” the system should know who Anna is, which room she’s in, and which speaker to use to alert her. The only way for that to work is if absolutely everything is connected.
And even if it does work eventually, Rubin will need to reassure users that the suite of always-listening devices tracking their every move is not a threat to their privacy. That’s why Essential built the Home to do a lot of its work on the device itself, without sending data to the cloud. Rubin, worryingly but perhaps unsurprisingly for the founder of Android and a longtime Google employee, isn’t terribly concerned about the privacy issue. Mostly, he says, it helps that he’s not selling your data. He doesn’t even want it. He’s selling you products, not ads.
This would be the ultimate smart home utopia, but there are good notes on privacy here. Apple is king of privacy and security in any market they enter. HomeKit requires end-to-end encryption, with hardware-based authentication (and software-based authentication as of iOS 11). While many may trust Andy Rubin, Essential has to walk the walk. They have to measurably demonstrate their accountability and be transparent with user data.
Andy’s vision of the smart home future is a grand one, but I’m not so sure we’ll get to the point where everything talks to everything else with ease right out of the box, no matter the manufacturer. It’s like the epitome of this legendary XKCD comic.
iPhone 4 was arguably when the iPhone became the modern iPhone. It was a huge leap from the 3GS: it was really fast, had a sexy glass-sandwich design, and introduced the Retina display and external antenna system. It even had huge controversies, from a prototype being lost in a bar and published by Gizmodo to antennagate. Along the same lines, I think Apple Watch Series 3 has the potential to be the ‘iPhone 4’ of its line in terms of performance and adoption (hopefully without the controversies).
I remember being extremely excited for the original Apple Watch’s (delayed) launch back in early 2015, more than any Apple product since the original iPhone. A close match was the AirPods, but that was a different kind of excitement. Now, with Fall quickly approaching, we are seemingly on track to receive another Watch update.
See, it’s all about experiences when it comes to technology. Apple Watch does a subset of things the iPhone does, but the experience it offers is visceral, compelling, and strikingly different than the iPhone. For instance, I’m more compelled to archive email from the Watch because it’s right there on my wrist. Same goes for quickly replying to a message or controlling music playback. These quick use cases and the experience factor make pulling the iPhone out of my pocket seem like a major drag. Using the Watch makes me feel like I’m accomplishing things with the speed of a ninja.
I still have my original 42mm Apple Watch.1 While I love it, it’s an absolute dog when it comes to doing tasks not already loaded in memory. I recently upgraded my wife’s original Watch to a Series 1 and am on the verge of stealing it, insanely jealous of the dual-core processor within. I tested hers by asking Siri to unlock the front door, a task that normally takes my Watch around 25 seconds to complete. Hers did it in less than 10. Considering this, I am impressed with myself for holding out for the Series 3, since it is highly unlike me to not upgrade (mostly) every Apple device upon its new release, but I digress.
Before Series 2 was announced, I thought all I wanted was a faster watch. The more I thought about it, though, the more I decided to wait for Series 3.2 While I knew Series 2 would be faster, I’m holding out for a major increase in speed, as I shouldn’t have to wait for my Watch to catch up to my commands. With that said, here’s what I’d like to see in Apple Watch Series 3.
Tim Cook was rumored to be testing a breakthrough blood glucose monitor that connects to Apple Watch. If Apple is making a play for a real ‘Medical Series’ Watch, it would have to pass stringent FDA requirements before it could be used for real medical data collection and evaluation. I think there’s a decent chance we could see a ‘Medical Series’ this year. Some health plans have already been subsidizing the cost of the regular Apple Watch, and Apple is rumored to be making a play in healthcare. Makes me think this will happen eventually.
Announcements from WWDC had virtually no leaks, and we were bombarded with awesome updates. I think we’re in for a similar surprise with Apple Watch this fall.
My other bet is on Apple Watch playing a bigger role in Apple’s ecosystem down the line, potentially involving AR. Imagine the Watch’s motion data being used as an input mechanism for future Apple AR glasses or similar. Sounds cool, right? Let’s get that future here as fast as possible. Bring on Series 3!
Jon Brodkin for Ars Technica:
Verizon Wireless customers this week noticed that Netflix’s speed test tool appears to be capped at 10Mbps, raising fears that the carrier is throttling video streaming on its mobile network.
When contacted by Ars this morning, Verizon acknowledged using a new video optimization system but said it is part of a temporary test and that it did not affect the actual quality of video. The video optimization appears to apply both to unlimited and limited mobile plans.
But some YouTube users are reporting degraded video, saying that using a VPN service can bypass the Verizon throttling. The Federal Communications Commission generally allows mobile carriers to limit video quality as long as the limitations are imposed equally across different video services despite net neutrality rules that outlaw throttling. The net neutrality rules have exceptions for network management.
“We’ve been doing network testing over the past few days to optimize the performance of video applications on our network,” a Verizon spokesperson told Ars. “The testing should be completed shortly. The customer video experience was not affected.”
I’m sorry, but what the fuck?
I’m not saying carriers shouldn’t be allowed to conduct tests on their own network, but Verizon did this in the most shady way possible. No notice was provided to customers who are paying for an expected level of service, and it was only discovered due to some clever sleuthing.
Verizon is clearly in favor of removing net neutrality regulations, so I guess we shouldn’t be too surprised at their latest bullshit.
Today, Apple announced a new journal (read: blog) to catalog their machine learning findings.
Welcome to the Apple Machine Learning Journal. Here, you can read posts written by Apple engineers about their work using machine learning technologies to help build innovative products for millions of people around the world. If you’re a machine learning researcher or student, an engineer or developer, we’d love to hear your questions and feedback. Write us at [email protected]
In the first entry, they discuss improving the realism of synthetic images used for training, since real training sets that are large, diverse, and accurately annotated are costly to build.
Most successful examples of neural nets today are trained with supervision. However, to achieve high accuracy, the training sets need to be large, diverse, and accurately annotated, which is costly. An alternative to labelling huge amounts of data is to use synthetic images from a simulator. This is cheap as there is no labeling cost, but the synthetic images may not be realistic enough, resulting in poor generalization on real test images. To help close this performance gap, we’ve developed a method for refining synthetic images to make them look more realistic. We show that training models on these refined images leads to significant improvements in accuracy on various machine learning tasks.
They go on to explain the challenges and methods used to refine synthetic images, demonstrated with example figures in the post.
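To caricature the core trade-off in one dimension (this is my own toy illustration, not Apple’s method, which uses adversarial training on full images): the refiner pulls a synthetic sample toward the statistics of real data, while a self-regularization term anchors it to the original so the annotations stay valid.

```python
def refine(synthetic: float, real_mean: float, steps: int = 100,
           lr: float = 0.05, reg: float = 0.5) -> float:
    """Gradient descent on 0.5*(x - real_mean)**2 + 0.5*reg*(x - synthetic)**2,
    i.e. a 'realism' pull plus a 'self-regularization' anchor."""
    x0, x = synthetic, synthetic
    for _ in range(steps):
        grad = (x - real_mean) + reg * (x - x0)
        x -= lr * grad
    return x

print(round(refine(0.2, 0.8), 3))  # ~0.6: pulled toward real, anchored by reg
```

The result settles between the synthetic starting point and the real-data statistics, which is the balance the post describes in far more sophisticated form.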
The post is fascinating. Augmented reality and machine learning are the next frontier for computing, and a growing focus for Apple. This is demonstrated by iOS 11’s ARKit and CoreML, which allow developers to easily implement these technologies in their apps. In a recent interview with Bloomberg, Tim Cook talked about autonomous systems and Apple’s focus on them, including software for self-driving cars, calling it “the mother of all AI projects”.
Some are worried Apple is limiting themselves in these areas because of their privacy and security stances. It’s a self-imposed limitation, yes, but that could be why they are being more open about publishing their findings in this space—to attract like-minded individuals who have the same passion and belief system. For example, all machine learning features on iOS right now run on-device. No identifiable data is sent back to iCloud and analyzed by a supercomputer to suggest similar faces in the Photos app, for instance. It’s all done by your iPhone or iPad. Mark Gurman even reported back in May that Apple is developing an ‘AI’ chip to specifically handle these tasks, similar to how the motion co-processor handles all motion data. Makes total sense.
I would much rather have the comfort of knowing my device is doing all the work, even if it comes at a cost of speed to market. Besides, it’s inevitable that our machines will do more for us on their own. Apple may take a little more time to get there, but that’s their M.O.: iPhone wasn’t the first smartphone, Apple Watch wasn’t the first smartwatch, but both products are now the benchmark in their markets. Apple will do this right, as opposed to other companies who live on getting their hands on your data, and it will be the benchmark for machine learning privacy.
Apple is indeed a secretive company, but under Tim Cook’s direction we are seeing them embrace the ability to be more open. One prior example is the open sourcing of Swift. It makes me excited to see what will come next as a result of this openness.