How does AI Benefit Drug Testing?
Animal testing remains an integral part of today's drug and chemical development and approval process because scientists cannot accurately predict the properties of new chemicals, let alone how they will interact with living cells. A new paper published in the research journal Toxicological Sciences shows that the attributes of new compounds can be predicted from data on past tests and experiments. An artificially intelligent system trained on previous animal tests can predict the toxicity of unknown chemicals, with results that are sometimes more accurate and reliable than the animal tests themselves.
The use of AI in drug development is not a new phenomenon. With 28 pharma companies and 93 startups already spending millions to apply machine learning and AI to drug discovery, the industry is primed for AI-driven disruption. AI can guide decisions about which compounds to make and test, leading to fewer experiments and saving both time and money.
Toxicity testing is often considered a difficult domain for AI because the data are messy and complicated, but those are precisely the conditions under which AI proves useful. AI makes it possible to draw accurate predictions from messy, imperfect data; Bayesian approaches in particular embrace that uncertainty and work best in exactly these situations.
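To make the Bayesian idea concrete, here is a minimal, hypothetical sketch of how repeated and even conflicting animal-test outcomes for one chemical can be folded into a probability of toxicity rather than a single pass/fail verdict. The Beta-Binomial model and all numbers are illustrative assumptions, not the method used in the paper.

```python
# Minimal sketch: Bayesian updating of a chemical's toxicity probability
# from noisy, repeated test outcomes. All numbers are illustrative.

def beta_posterior(prior_alpha, prior_beta, toxic_hits, total_tests):
    """Update a Beta prior with binomial (toxic / not toxic) outcomes."""
    alpha = prior_alpha + toxic_hits
    beta = prior_beta + (total_tests - toxic_hits)
    mean = alpha / (alpha + beta)  # posterior mean probability of toxicity
    return alpha, beta, mean

# Start from a vague prior (Beta(1, 1), i.e. uniform), then fold in
# conflicting replicate results: 3 toxic outcomes out of 5 tests.
alpha, beta, mean = beta_posterior(1, 1, 3, 5)
print(f"posterior Beta({alpha}, {beta}), mean toxicity = {mean:.2f}")
# -> posterior Beta(4, 3), mean toxicity = 0.57
```

The posterior mean of about 0.57 leans toxic while retaining the uncertainty that a single binary test result would hide, which is exactly why such approaches suit messy replicate data.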
Big data makes it possible to build a tool more predictive than animal tests such as dropping compounds into rabbits' eyes to check for irritants, or feeding them to rats to identify lethal doses. This predictive approach was made possible by feeding the AI a vast amount of data harnessed from datasets collected by the European Chemicals Agency (ECHA) under the REACH (registration, evaluation, authorization, and restriction of chemicals) law of 2007. While this data is widely available, its format is not machine-readable. Thomas Hartung, a toxicologist at Johns Hopkins University in Baltimore, and his team reformatted it into a machine-readable database covering 10,000 chemicals and their properties, gathered from over 800 animal tests. The system can now predict the toxicity of several thousand chemicals across nine different test types, covering everything from inhalation damage to effects on aquatic ecosystems.
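The underlying intuition — that a new chemical will behave like structurally similar chemicals already in the database — can be sketched as a toy "read-across" nearest-neighbor vote. The real system is far more sophisticated; the fingerprints, substructure names, and labels below are invented for illustration only.

```python
# Toy "read-across" sketch: predict a query chemical's toxicity from its
# most structurally similar neighbors in a labeled database. Fingerprints
# are hypothetical sets of substructure names; labels are invented.

def jaccard(a, b):
    """Similarity between two sets of structural features (0 to 1)."""
    return len(a & b) / len(a | b)

def predict_toxic(query, database, k=3):
    """Majority vote among the k chemicals most similar to the query."""
    ranked = sorted(database, key=lambda rec: jaccard(query, rec[0]),
                    reverse=True)
    votes = [toxic for _, toxic in ranked[:k]]
    return sum(votes) > k / 2

# Hypothetical database: (fingerprint, known-toxic?) pairs.
database = [
    ({"nitro", "aromatic", "chloro"}, True),
    ({"nitro", "aromatic"},           True),
    ({"hydroxyl", "aliphatic"},       False),
    ({"hydroxyl", "ether"},           False),
    ({"aromatic", "chloro"},          True),
]
query = {"nitro", "aromatic", "hydroxyl"}
print(predict_toxic(query, database))  # nearest neighbors are nitroaromatics
```

Scaled up to 10,000 chemicals and hundreds of test endpoints, this is the kind of similarity-driven inference that lets a trained system fill in toxicity predictions without running new animal experiments.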
Reducing animal testing in drug development is not only a noble cause for humanity and animal welfare; it also shortens the development process and cuts costs. In February 2018, the Interagency Coordinating Committee on the Validation of Alternative Methods published a roadmap for replacing animal use in toxicity testing.
Even as computer systems and AI gradually replace many of the standard safety tests carried out on animals each year, longer-term effects such as carcinogenicity and impacts on fertility remain beyond their reach for now. Still, the current prospects are highly promising for animal rights advocates and consumers alike.