From IoT to AI Chips
FREMONT, CA: The internet of things (IoT) continues to evolve, but the semiconductor industry's focus is shifting toward AI chips: algorithm-specific ASICs, SoCs, ASSPs, and accelerators for artificial intelligence (AI). The rapid rise of AI, particularly of deep learning algorithms, is expected to drive radical growth in the semiconductor industry as the emphasis moves from data generation to data analysis.
Running AI algorithms for analysis has pushed CPUs, FPGAs, and GPUs to their limits. The demand for fast neural network training and more efficient inference is driving semiconductor manufacturers to change their approach. A blend of processing elements is needed to run AI algorithms for applications such as autonomous vehicles, financial markets, agriculture, blockchain, and smart cities.
Powerful, efficient inference engines are needed both at the edge and in data centers. Solution providers can combine heterogeneous processing elements, memory, and the relevant architectures to drive the efficiency and performance of AI solutions. Original equipment manufacturers (OEMs) and dedicated chip houses will invest in SoCs, ASICs, and ASSPs to implement advanced algorithms at the edge and in the data center. AI at the edge will not only deliver lower-latency responses but will also reduce bandwidth and storage costs.
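One reason edge inference cuts bandwidth and storage costs is weight quantization: edge accelerators typically run models in 8-bit integer form rather than 32-bit floating point. The following is a minimal sketch of symmetric int8 quantization using NumPy; the tensor shape and scale scheme are illustrative assumptions, not tied to any specific chip or framework.

```python
import numpy as np

# Hypothetical FP32 weight tensor for one layer of an edge model
# (shape chosen for illustration only).
weights = np.random.randn(256, 256).astype(np.float32)

# Symmetric 8-bit quantization: one scale maps floats onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)

# Dequantize at inference time on the edge device.
deq = q.astype(np.float32) * scale

# int8 storage is a quarter of the FP32 footprint, which is what
# shrinks the model's bandwidth and storage cost at the edge.
print(weights.nbytes, q.nbytes)
```

The 4x size reduction applies to every weight transfer, so the same saving shows up when models are shipped to devices and when they are held in on-device storage; the trade-off is a small rounding error bounded by the quantization scale.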
With neural networks being implemented in smart devices, the demand for AI at the edge is growing. Investment in inference and training at the data center is predicted to grow to as much as $5 billion. Smart silicon providers are steadily moving toward silicon production to meet the demands of these changing trends. IoT is slowly shifting from connecting smart devices toward facilitating continuous AI training and predictive analytics.
Semiconductor companies are taking different approaches to meet the demand for AI solutions for inference processing at the network edge. Deep learning technologies have given rise to new systems and architectures. Several companies already develop chips with AI capabilities, and many more are expected to join the race.