On-boarding Artificial Intelligence – A Reality Check
Recently, the iPhone 8 was launched with much fanfare. Advertisements announcing its arrival and availability were everywhere this festival season. Apple is emphasizing four differentiators in the new iPhone: an all-glass design, advanced cameras, wireless charging and the A11 Bionic chip. Coming from Apple's stable, these are all bleeding-edge innovations. While the first three are straightforward, what caught my attention was the coming of age of mobile computing chips as a major differentiator. On digging deeper, it seems evident that this product will take us much closer to an Artificial Intelligence (AI) system that meticulously monitors the owner of the phone and learns from their behavior.
The A11 Bionic chip that I am talking about here is a SoC (System-on-Chip) with an embedded AI unit that Apple has named the Neural Engine. While there is a certain novelty associated with whatever Apple does, other players in this game are investing equally in the technology. Huawei's latest smartphone SoC, the Kirin 970, includes what the company calls a Neural Processing Unit (NPU). These developments strongly indicate that on-board AI has finally arrived and will be a significant improvement over the cloud AI that we have been experiencing so far.
However, Qualcomm, the largest smartphone SoC manufacturer, has a different take on this. Instead of a dedicated NPU, it has developed a Software Development Kit (SDK) that allows developers to optimize apps to run AI algorithms on its Snapdragon 600/800 series processors.
Is this the End of Cloud AI?
So, with this development and the trend towards on-board AI, has the countdown begun for cloud AI applications? Not really. While on-board AI certainly has advantages in processing local data quickly on dedicated silicon, cloud AI will retain the edge for in-depth analysis and processing of large amounts of non-contextual data. It will also be difficult for on-board AI to interact with the on-board AI of peripheral devices. For that interaction to happen, it must either go through the cloud with some sort of broker in between, or rely on a trusted security mechanism for devices to share data directly. Most likely, it is going to be the former for the foreseeable future.
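To make the trade-off concrete, here is a minimal sketch of how an app might route an AI task between the on-board unit and a cloud service. All names and thresholds here are illustrative assumptions, not any vendor's actual API:

```python
# Hypothetical routing logic between on-board AI and cloud AI.
# Task names, the payload threshold, and the function itself are
# illustrative assumptions for this sketch.

# Latency- and privacy-sensitive tasks suited to the device's neural unit
ON_DEVICE_TASKS = {"face_unlock", "battery_optimization", "keyboard_prediction"}

def route_inference(task: str, payload_mb: float) -> str:
    """Pick an execution target for an AI task.

    Small, contextual, device-dependent workloads run on-board;
    large non-contextual workloads needing deeper analysis go to the cloud.
    """
    if task in ON_DEVICE_TASKS and payload_mb < 10:
        return "on_device"
    return "cloud"
```

Under this sketch, `route_inference("face_unlock", 0.5)` returns `"on_device"`, while a bulk job such as `route_inference("photo_library_clustering", 500)` returns `"cloud"`.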
Another differentiating aspect is the type of application. On-board AI will have an edge for device-dependent activities such as security, optimizing running apps to maximize battery life, and cloning personal behavior where necessary, while cloud AI will have an edge in workloads that require deeper processing and more sophisticated data models.
Irrespective of all this, the development of on-board AI processing is a significant step forward and will open new vistas for products and applications. As chip and AI technology develop simultaneously, the applications built on top must not only keep pace but also undergo frequent changes to adapt and provide increasingly sophisticated functionality. For this to happen, application and product developers need to follow best practices not only in product conceptualization but also in development, to ensure trouble-free long-term sustenance of their products.
For more insights please feel free to connect with me on firstname.lastname@example.org or Tweet @mchinmoy