Artificial intelligence is playing a bigger role in health care. It is expanding that role by detecting severe, life-threatening conditions that standard diagnostic tools can miss. AliveCor, the company behind the KardiaBand, has developed a machine-learning approach for detecting heart conditions - specifically Long QT Syndrome (LQTS), a condition that often goes undetected.
The QT interval is the time between the start of the Q wave and the end of the T wave in the heart's electrical cycle - in other words, how long the heart takes to recharge between beats. In LQTS, that recharge takes longer than normal, which can lead to blackouts, seizures, palpitations, and death. It's not a common condition - it affects about one in 2,000 people - and it is often caused by drugs with QT-prolonging potential. It isn't easy to diagnose, either, because about 50 percent of patients diagnosed with the condition show a normal QT interval on their electrocardiogram (ECG).
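To put a number on "longer than normal," the measured QT interval is usually corrected for heart rate before being compared to a prolongation threshold. Here is a minimal Python sketch using Bazett's correction, a standard convention; the threshold mentioned in the comment is a commonly cited clinical figure, not a value from AliveCor's study:

```python
import math

def qtc_bazett(qt_ms: float, rr_ms: float) -> float:
    """Heart-rate-corrected QT interval via Bazett's formula.

    qt_ms: measured QT interval in milliseconds
    rr_ms: interval between successive R peaks in milliseconds
    """
    rr_s = rr_ms / 1000.0            # Bazett's formula takes RR in seconds
    return qt_ms / math.sqrt(rr_s)

# Example: a 460 ms QT at 60 bpm (RR = 1000 ms) gives QTc = 460 ms,
# near the commonly cited prolongation threshold (~450-470 ms).
print(round(qtc_bazett(460, 1000)))  # 460
```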
AliveCor has presented findings showing that deep-neural-network AI can identify and diagnose LQTS patients even when their ECG reading appears normal. The results show a specificity of 81%, a sensitivity of 73%, and an accuracy of 79%, and the findings apply to both of AliveCor's ECG devices, the KardiaMobile and the KardiaBand.
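For context on those three figures: sensitivity, specificity, and accuracy are all derived from a binary classifier's confusion matrix. A short sketch of the standard definitions follows; the counts below are hypothetical, chosen only to show the arithmetic, and are not AliveCor's data:

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard metrics from a binary confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of true LQTS cases caught
        "specificity": tn / (tn + fp),   # fraction of healthy ECGs correctly cleared
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts, chosen only to illustrate the definitions:
print(classification_metrics(tp=73, fn=27, tn=81, fp=19))
# {'sensitivity': 0.73, 'specificity': 0.81, 'accuracy': 0.77}
```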
AI is also being used to look deeper into human cells and what they can do.
The Allen Institute has made significant progress in this field by developing a live 3D model of a human cell, called the Allen Integrated Cell. It offers a new way to see inside human cells - almost as if you were looking inside a cell for the first time - and it has the potential to improve drug discovery, disease research, and basic studies of how human cells are organized.
Scientists edited the genomes of live human cells to add fluorescent protein tags that light up specific structures inside the cells. The team took thousands of images of these cells and used AI to predict the likely location and shape of each structure based on the shape of the plasma membrane and the nucleus.
Afterward, the team applied a second algorithm to those images, one that used what it had learned from the fluorescently labeled cells to find the same structures in cells without labels. The system could then render its predictions as images that look nearly identical to traditional fluorescence microscopy.
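Taken together, the two steps amount to image-to-image learning: train a network on paired label-free and fluorescence images, then run it on unlabeled cells to predict where structures would light up. Below is a heavily simplified PyTorch sketch of that idea; the architecture, layer sizes, and 2D inputs are illustrative assumptions, not the Allen Institute's actual model, which works on 3D image stacks:

```python
import torch
import torch.nn as nn

class LabelFreeNet(nn.Module):
    """Tiny encoder-decoder mapping a transmitted-light (label-free) image
    to a predicted fluorescence channel for one tagged structure.
    A compressed stand-in for U-Net-style label-free prediction models;
    the layer sizes here are illustrative only."""
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # downsample
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # upsample
            nn.Conv2d(16, 1, 3, padding=1),  # predicted fluorescence intensity
        )

    def forward(self, x):
        return self.decode(self.encode(x))

# Training pairs: (label-free image, real fluorescence image) from tagged cells.
# Once trained, the model is applied to untagged cells, "lighting up" their
# structures without any fluorescent label.
model = LabelFreeNet()
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

brightfield = torch.randn(4, 1, 64, 64)   # stand-in batch of label-free images
fluorescence = torch.randn(4, 1, 64, 64)  # matching fluorescence targets

opt.zero_grad()
loss = loss_fn(model(brightfield), fluorescence)  # one training step
loss.backward()
opt.step()
```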
Scientists can now see what is going on inside human cells with an ease that wasn't possible before. Previously, they could only see the proteins in a cell that had been labeled for study. The Allen Integrated Cell changes that by making a wide array of proteins available for visualization - an enhanced, more powerful way to do cell biology.
Have a story tip? Message me at: cabe(at)element14(dot)com