Similar technology used for two different purposes by two tech giants
(images via Alamy Stock Photo/Amazon)
Our eyes are among the most important organs in the human body, which is why it's critical to keep them healthy. But eyes can be damaged or develop disease with few warning signs. What if there were a way to be diagnosed before symptoms grew worse? A new collaboration between Google and the UK's National Health Service (NHS) may have the answer. Google's DeepMind division is teaming up with the NHS for a second time, working with Moorfields Eye Hospital in east London. The goal is to build a machine learning system that can eventually recognize sight-threatening conditions from a single digital eye scan.
Though this is the second time the two organizations have worked together, it's the first time DeepMind is conducting purely medical research. The previous collaboration, with the Royal Free Hospital in north London, focused on direct patient care, using a smartphone app called Streams to monitor patients' kidney function. The Moorfields project is also the first time DeepMind has applied machine learning to a healthcare project. Most of the research is based on millions of anonymized eye scans, which researchers will use to train an algorithm to spot early signs of eye conditions like age-related macular degeneration and diabetic retinopathy, conditions whose damage can be prevented or limited if they're detected early enough.
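To make "training an algorithm on eye scans" a little more concrete, here's a minimal sketch in Python with TensorFlow/Keras of the kind of convolutional classifier one could train on labeled retinal scans. It is not DeepMind's actual model; the image size, class list, and dataset are illustrative assumptions.

```python
# A minimal sketch, not DeepMind's system: a small convolutional network
# of the kind typically trained to classify labeled retinal scans.
# The image size, class list, and dataset are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)
CLASS_NAMES = ["healthy", "macular_degeneration", "diabetic_retinopathy"]

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(len(CLASS_NAMES), activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# `train_ds` would be a tf.data.Dataset of (scan, label) pairs built from
# the anonymized scans; the call is commented out because no data ships
# with this sketch.
# model.fit(train_ds, epochs=10)
```

Once trained, a model like this outputs a probability for each condition on every new scan, which is what would let it flag early warning signs before symptoms grow worse.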
Some may remember that DeepMind's previous project with the Royal Free Hospital was criticized over privacy issues: the hospital gave the company access to 1.6 million patient records, which understandably upset many people. To head off similar concerns, DeepMind emphasizes that the Moorfields scans were collected over a long period of time. The company assures anyone worried about their information that these are “historic scans” and that there's no way to identify individual patients from them. It also plans to submit the research to peer-reviewed journals, “so others in the medical field can analyze them.”
But Google isn't the only one putting deep learning to work; Amazon is getting in on the action as well. The Amazon Picking Challenge, an annual competition that searches for robots capable of working in the company's warehouses, has wrapped up and a winner has been picked. This year's winning entry, from the TU Delft Robotics Institute in the Netherlands, picked items from a mock Amazon warehouse shelf at a rate of 100 items an hour with a failure rate of 16.7 percent. Those numbers are an improvement over last year's winner, but still well short of what a human can do (around 400 items an hour).
The key to the Delft team's success was artificial intelligence. To help the robot succeed, the researchers used deep learning to analyze 3D scans of the items the robot had to pick out and replace. A customized gripper-and-suction attachment on the robotic arm did the mechanical work, while the AI-powered software made the robot faster and more efficient than its competitors.
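As a rough illustration of how such a pipeline might fit together (again a sketch under stated assumptions, not the Delft team's code), a learned scorer can rank candidate suction points extracted from an item's 3D scan, and the arm then moves to the highest-scoring point:

```python
# A toy sketch, not the Delft team's system: rank candidate suction points
# from a 3D scan with a small learned scorer and pick the best one.
# The features, network, and random stand-in "scan" are all assumptions.
import numpy as np
import tensorflow as tf

def candidate_features(points: np.ndarray) -> np.ndarray:
    """Per-point features: 3D position plus distance from the item's
    center, a crude stand-in for local surface geometry."""
    center = points.mean(axis=0)
    dist = np.linalg.norm(points - center, axis=1, keepdims=True)
    return np.hstack([points, dist]).astype("float32")

# Tiny network mapping a 4-d feature vector to P(suction grasp succeeds).
# In a real system it would be trained on logged pick attempts.
scorer = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

scan = np.random.rand(500, 3)              # stand-in for a 3D item scan
scores = scorer.predict(candidate_features(scan), verbose=0).ravel()
target = scan[np.argmax(scores)]
print("move suction gripper to point:", target)
```

The design point is the division of labor: the learned model only decides where to grasp, while the customized gripper hardware handles the physical pick.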
The contest, which is like The Voice for robots, is held to find the most efficient bots so they can eventually be placed in Amazon warehouses. Aha! So robots ARE replacing humans! Not true, according to Amazon. The company insists that the point of the competition is not to replace established workers but to augment the workforce and let employees do more. "Robotics enhance the job for employees but does not replace them," said an Amazon spokesman. "In fact, we continue to hire. Many of those roles are being created in buildings where employees are working alongside Amazon robotic drive units."
These two projects are just a sample of what deep learning is capable of. Whether it's improving patient care or building a better robot, the potential applications of AI are vast. Expect to see more companies and researchers take advantage of the technology.
Have a story tip? Message me at: