
Friday, May 11, 2018


A tensor processing unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google specifically for neural network machine learning.





Overview

The tensor processing unit was announced in 2016 at Google I/O, when the company said that the TPU had already been used inside its data centers for over a year. The chip has been specifically designed for Google's TensorFlow framework, a symbolic math library which is used for machine learning applications such as neural networks. However, Google still uses CPUs and GPUs for other types of machine learning. AI accelerator designs are also appearing from other vendors, aimed at the embedded and robotics markets.

Google's TPUs are proprietary and are not commercially available, although on February 12, 2018, The New York Times reported that Google "would allow other companies to buy access to those chips through its cloud-computing service." Google has stated that TPUs were used in the AlphaGo versus Lee Sedol series of man-machine Go games, as well as in the AlphaZero system, which produced chess, shogi and Go playing programs from the game rules alone and went on to beat the leading programs in those games. Google has also used TPUs for Google Street View text processing, and was able to find all the text in the Street View database in less than five days. In Google Photos, an individual TPU can process over 100 million photos a day. TPUs are also used in RankBrain, which Google uses to provide search results.

Compared to a graphics processing unit, the TPU is designed for a high volume of low-precision computation (e.g. as little as 8-bit precision) with higher IOPS per watt, and it lacks hardware for rasterisation/texture mapping. The TPU ASICs are mounted in a heatsink assembly, which can fit in a hard drive slot within a data center rack, according to Google Distinguished Hardware Engineer Norman Jouppi.
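
To make the idea of low-precision computation concrete, the sketch below quantizes two float vectors to 8-bit integers, multiplies them with 32-bit accumulation, and rescales the result. This is a minimal NumPy illustration of quantized arithmetic in general, not a description of the TPU's actual quantization scheme; the symmetric scales and rounding choices are assumptions made for the example.

    import numpy as np

    # Illustrative 8-bit quantized dot product (not the TPU's actual scheme).
    # Floats are mapped to int8 with a per-vector scale, multiplied as
    # integers, accumulated in 32 bits, and rescaled back to float.

    def quantize(x, scale):
        return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

    rng = np.random.default_rng(0)
    a = rng.standard_normal(256).astype(np.float32)
    b = rng.standard_normal(256).astype(np.float32)

    scale_a = np.abs(a).max() / 127
    scale_b = np.abs(b).max() / 127

    qa, qb = quantize(a, scale_a), quantize(b, scale_b)

    # Integer multiply-accumulate, then rescale to approximate the float result.
    acc = np.dot(qa.astype(np.int32), qb.astype(np.int32))
    print(float(np.dot(a, b)), float(acc * scale_a * scale_b))

The two printed values agree closely but not exactly, which is the trade-off such low-precision hardware accepts in exchange for density and efficiency.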




Generations

First generation

The first-generation TPU is an 8-bit matrix multiply engine, driven with CISC instructions by the host processor across a PCIe 3.0 bus. It is manufactured on a 28 nm process with a die size of ≤ 331 mm². The clock speed is 700 MHz and it has a thermal design power of 28–40 W. It has 28 MiB of on-chip memory and 4 MiB of 32-bit accumulators taking the results of a 256×256 systolic array of 8-bit multipliers. Within the TPU package is 8 GiB of dual-channel 2133 MHz DDR3 SDRAM offering 34 GB/s of bandwidth. Instructions transfer data to or from the host, perform matrix multiplications or convolutions, and apply activation functions.
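
As a functional sketch only (assumed shapes and random data; the systolic-array dataflow, instruction set and timing are not modelled), the matrix unit's arithmetic can be mimicked in NumPy: a 256×256 tile of 8-bit operands is multiplied while partial sums are kept in 32-bit accumulators.

    import numpy as np

    # Functional sketch of the first-generation matrix unit's arithmetic:
    # 256x256 tiles of 8-bit operands with 32-bit accumulation. This models
    # only the numerics, not the systolic-array hardware itself.

    rng = np.random.default_rng(0)
    weights = rng.integers(-128, 128, size=(256, 256), dtype=np.int8)
    activations = rng.integers(-128, 128, size=(256, 256), dtype=np.int8)

    # Widen to int32 before multiplying so partial sums do not overflow,
    # mirroring the role of the TPU's 32-bit accumulators.
    acc = weights.astype(np.int32) @ activations.astype(np.int32)

    print(acc.dtype, acc.shape)  # int32 (256, 256)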

Second generation

The second-generation TPU was announced in May 2017. Google stated that the first-generation TPU design was memory-bandwidth limited, and that using 16 GB of High Bandwidth Memory in the second-generation design increased bandwidth to 600 GB/s and performance to 45 TFLOPS. The TPUs are arranged into 4-chip modules with a performance of 180 TFLOPS, and 64 of these modules are then assembled into 256-chip pods with 11.5 PFLOPS of performance. Notably, while the first-generation TPUs were limited to integers, the second-generation TPUs can also calculate in floating point. This makes the second-generation TPUs useful for both training and inference of machine learning models. Google has stated these second-generation TPUs will be available on the Google Compute Engine for use in TensorFlow applications.
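
The quoted module and pod figures follow directly from the per-chip number; the short check below is plain arithmetic, nothing TPU-specific.

    # Quick check of the quoted second-generation figures.
    tflops_per_chip = 45
    chips_per_module = 4
    modules_per_pod = 64

    tflops_per_module = tflops_per_chip * chips_per_module    # 180 TFLOPS
    pflops_per_pod = tflops_per_module * modules_per_pod / 1000
    print(tflops_per_module, pflops_per_pod)  # 180 TFLOPS, 11.52 PFLOPS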

Third generation

The third-generation TPU was announced on May 8, 2018, and is stated by Google to be eight times more powerful than the second generation.




See also

  • Vision processing unit, a similar device specialised for vision processing.
  • TrueNorth, a similar device simulating spiking neurons instead of low-precision tensors.
  • Neural processing unit





External links

  • Photo of Google's TPU chip and board

Source of the article: Wikipedia
