AI, AI, and guess what? More AI. Artificial intelligence is one of the main topics at CES this year, with a whopping 50-plus sessions dedicated to the subject. AI is everywhere, and it is coming to the edge, too.
At least, that is what some experts believe. “Why do I have to run any AI inferencing at the edge to begin with? The first layer to answer this question is because we can,” said Durga Malladi, Senior VP and General Manager of Qualcomm, during a panel at CES.
“The computational power we have in devices today is significantly greater than we’ve seen in the last five years, especially with processors being designed from the ground up for AI inference. If you can run it directly on your device, that’s great,” he added.
However, we need to take a step or two back if we want to accurately assess the benefits of bringing artificial intelligence to the edge.
Starting with a simple question: what is the edge, after all?
“Our definition of edge is practically every single device we use in our daily lives today,” Malladi explained.
“This can range from smartphones to PCs but could also be an enterprise server that is right in your building or your Wi-Fi access point, which could act as a hub now for a lot of other processing. Not just devices, but something that is absolutely close to them.”
If the edge is the device itself, or anything immediately close to it, then processing data there makes communications and interactions faster for the end user. But that is not the only benefit.
“There are some key elements of AI,” observed Nicole Peng, Research VP, Consumer at Informa. “[We can cite] computation, the model, and there is the data. The edge is where we collect the most data. So, why not bring AI to the Edge to compute in there as long as we have the power and then the model ready?”
According to the experts, the clear answer seems to be "Yes, we definitely should."
More Than Latency
One of the strongest cases for processing data at the edge is lowering latency. The lower the latency, the better the end-user experience. It also unlocks more use cases, such as XR technologies and connected cars.
However, what sounds even better is that a good share of your data could remain local.
“The data is generated right at the edge, which means you want to tap into some local context and keep some of the private data, tapping into that, providing the right answers from whatever it is that you’re doing with AI inference,” Malladi explained.
AI for the Enterprise
While end users and consumers – just like you and me – should benefit from the approach, Steve Long, SVP and GM of Commercial Segment IDG at Lenovo, sees another area poised to improve significantly with the addition of AI to edge processing.
“I think AI [combined with the edge] is going to take us to the ‘other edge,’ which is the devices that are computing data, the things that are capturing sensors and changing and transforming industries. Like in retail, like the cameras, the smart home applications,” he pointed out.
That is a lot of data to process, though. What experts hope is that AI will help handle the job – at the edge, preferably.