Programmers will be able to install TPU chips on their own machines starting this October, but it’ll cost them. These tiny chips help make AI tasks more efficient. (Photo via Google)
A couple of years ago, Google revealed its Tensor Processing Units (TPUs), specialized chips housed in the company’s data centers that make AI tasks easier and more efficient. For instance, TPUs can speed up AI tasks like understanding voice commands or identifying objects in pictures. Now, Google is ready to share this technology with the masses thanks to a new program called Edge TPU.
Edge TPU allows programmers to install the chips on their own machines for a price starting this October. The tiny chip augments Google’s Cloud TPU and Cloud IoT services, which provide an end-to-end infrastructure for AI-based needs. It can be used for a wide range of applications, including predictive maintenance, anomaly detection, machine vision, robotics, and voice recognition, just to name a few.
These chips are designed for enterprise jobs, like quality control checks in factories, so don’t expect to see them in your next smartphone. Using the chips has many advantages over standard over-the-internet data analysis: on-device machine learning is usually more secure, experiences less downtime, and delivers faster results.
"There are also many benefits to be gained from intelligent, real-time decision-making at the point where these devices connect to the network," said Injong Rhee, vice president of Google Cloud's internet of things work, referring to decisions made without waiting for a round trip over the network to Google's machines. "Your sensors become more than data collectors -- they make local, real-time, intelligent decisions."
Making Google TPUs available to customers makes sense. It could increase both the number of customers and the range of workloads drawn to Google’s technology. It also puts Google in direct competition with Microsoft’s Azure computing service, which aims to extend its own AI processing technology to customers via Project Brainwave.
Unlike Google’s TPUs, Project Brainwave uses an unusual processor type called a field programmable gate array (FPGA), which is both fast and flexible. Similar to Edge TPU, it also promises to accelerate AI chores using the latest algorithms and can handle AI tasks fast enough for real-time jobs. You’ll be able to run AI workloads on Microsoft hardware at your own sites.
Google TPUs will be available in a hardware module that can be plugged into a computer via a PCI Express expansion slot or a USB port. Those who are interested can apply for online access to Google’s AI chips here. There’s no word yet on how much it’ll cost to actually get your hands on one.
Have a story tip? Message me at: cabe(at)element14(dot)com