Will ASIC Chips Become The Next Big Thing In AI?

https://www.forbes.com/sites/moorinsights/2017/08/04/will-asic-chips-become-the-next-big-thing-in-ai/#9da6de711d9f

This is a very good article that explains the hardware side of artificial intelligence. It is important to understand both the hardware and the software sides of this groundbreaking technology.

From the article:

Technically, a GPU is an ASIC used for processing graphics algorithms. The difference is an ASIC offers an instruction set and libraries to allow the GPU to be programmed to operate on locally stored data—as an accelerator for many parallel algorithms. GPUs excel at performing matrix operations (primarily matrix multiplications, if you remember your high school math) that underlie graphics, AI, and many scientific algorithms. Basically, GPUs are very fast and relatively flexible.
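For readers curious about what the article means by "matrix operations," here is a minimal sketch in Python using the NumPy library (my choice for illustration, not something from the article). It simply multiplies two small matrices; a GPU performs this same kind of multiplication on far larger matrices, thousands at a time, which is why it accelerates graphics and AI workloads.

    import numpy as np

    # Two small 2x2 matrices; real AI workloads use matrices with
    # thousands of rows and columns.
    a = np.array([[1.0, 2.0], [3.0, 4.0]])
    b = np.array([[5.0, 6.0], [7.0, 8.0]])

    # The matrix product is the core operation the article describes.
    print(a @ b)  # [[19. 22.] [43. 50.]]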

Follow Attorney Nessler to receive notifications when new recommended readings are posted, because it is important to keep up with how technology is changing the practice of law and law firm management.