iPhone X is cool

I went back to Hong Kong at the beginning of this month, and I took this picture on the footbridge at the entrance to the Western Harbour Crossing, on the Kowloon side.

All I have to say is: this phone is awesome. The first photo is from the iPhone X and the second from my old iPhone 6. Standing in the same spot, the two phones told very different stories.

This was exactly the moment I wanted to quote Neil Gaiman: “Different people remember things differently, and you’ll not get any two people to remember anything the same, whether they were there or not. You stand two of you lot next to each other, and you could be continents away for all it means anything.”

People everywhere told me the new iPhone is too expensive. But with Face ID, Animoji, and an awesome camera, is it worth it? Purely my opinion: yes. I paid almost 10k for this phone, and I think it was money well spent.

I thought Face ID would be as bad as Touch ID, which I rarely used. But Face ID is great. For the first two days it struggled, but soon its learning algorithm started to capture my facial features, and by the time I was sitting on my favorite Emirates flight back to Sweden, it worked without me even taking my sunglasses off. That was the true “WOW” moment.

Another cool thing is the new feature in Photos. I have to admit I never upgraded iOS after what happened to the look of the default apps between iOS 6 and iOS 7, so this “Best of 2017” was a huge surprise. It captured every single moment of my trip back to Hong Kong. It felt very personal, the music had quite a strong Chinese element to it, and it perfectly cropped the long video of the whole takeoff from Dubai. I would not have been able to make such a good video myself. It told a perfect story.

Is deep learning the next big thing? Friends have been sharing their “Best of” videos on Instagram, and the feedback seems overwhelmingly positive. Now I am seriously interested in deep learning and neural networks. Perhaps that is the way to discover our individual utility functions: with some tracked inputs from the user, a model could eventually predict marginal utility.
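Just to make the idea concrete for myself, here is a toy sketch of what “learning a personal utility function from tracked inputs” could look like. Everything in it (the features, the data, the linear model) is invented for illustration; a real system would obviously be far more sophisticated.

```python
# Toy sketch: learn a personal "utility" score from tracked inputs.
# The features and data below are entirely made up for illustration.

def predict(weights, bias, features):
    """Linear utility estimate: weighted sum of tracked features."""
    return sum(w * x for w, x in zip(weights, features)) + bias

def train(data, epochs=2000, lr=0.01):
    """Fit the weights with plain stochastic gradient descent on squared error."""
    n = len(data[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for features, utility in data:
            err = predict(weights, bias, features) - utility
            weights = [w - lr * err * x for w, x in zip(weights, features)]
            bias -= lr * err
    return weights, bias

# Hypothetical tracked inputs: (hours of use, photos taken) -> reported utility.
data = [
    ([1.0, 0.0], 1.0),
    ([2.0, 1.0], 2.5),
    ([3.0, 2.0], 4.0),
    ([4.0, 3.0], 5.5),
]
weights, bias = train(data)
estimate = predict(weights, bias, [2.0, 1.0])
```

A real version would replace the linear model with a neural network and the made-up tuples with actual usage data, but the loop is the same shape: observe inputs, compare predicted utility to how much the user actually valued something, and adjust.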

That would be seriously cool, and it would open the door to wants = needs.