AI has allowed healthtech startups to process vast amounts of patient and drug data to find new drug treatments.
By Applied Technology Review | Tuesday, February 09, 2021
The pharmaceutical industry has long been dominated by large pharmaceutical companies, often known as “big pharma”, and for good reason: developing drugs is incredibly expensive, time-consuming, and risky. Pharmaceutical companies spend hundreds of millions of dollars and many years discovering new drugs, testing them, and then seeking regulatory approval. However, the majority of promising drug candidates fail to obtain approval because they do not deliver the necessary level of clinical benefit or have unacceptable side effects. Artificial intelligence (AI) is changing the landscape by shortening discovery times whilst reducing the number of failed drug candidates.
In recent years, AI has become ubiquitous in modern business. Far from the realm of science fiction, almost every sector and industry has been changed in some way by AI automating previously manual processes that took humans far longer to carry out. From finance to agriculture, AI has been deployed to assist humans in their work, improving accuracy, decision-making, and time efficiency.
The healthcare and especially the healthtech industries are no different. Previously, healthtech companies developed traditional software to remind patients to take pills, facilitate virtual doctor’s appointments, or allow those with diabetes to track blood sugar levels. Although these applications are genuinely useful, AI has now swept in and provided an entirely new and exciting opportunity for healthtech companies to interact with the pharma pipeline. Most importantly, the computing power of AI algorithms has opened the lucrative drug discovery, drug repurposing, and personalised medicine markets to healthtech companies.
The growth of AI healthtech startups has created a need to patent not just the computer software but also the inventions derived using it, so that startups do not lose out on monetising their innovations. Using AI to help facilitate invention has, however, become a contentious issue in recent months: the DABUS patent cases drew media attention to the question of whether an AI platform can be named as an inventor in a patent application – the answer was a firm “no”. The important point is that in most healthtech cases AI is not actually inventing but rather facilitating and speeding up innovation, and there is no question that the insights AI provides can be patented.
The high barrier of entry to the pharma pipeline has been broken down by the introduction of AI, which can do much of the legwork by operating on huge data sets using the power of modern computer processors, and at a fraction of the cost. What previously took the likes of AstraZeneca and GlaxoSmithKline thousands of iterations using hundreds of pharmacists and lab hours can now be done by a handful of data scientists and pharmacists with a computer and access to appropriate data sets. The ability to patent computer-assisted discoveries allows AI startups in this field to quickly and securely monetise them, helping the company become revenue-generating.
FREMONT, CA: AI allows these startups to process vast amounts of patient and drug data to find new treatments. For example, AI can be used to design the ideal structure for a completely new drug by crunching data on the biological target. AI can also be used to match a disease with an unmet need to already-approved drugs, by analysing the complex pharmacology of those drugs and the physiology of the disease. Every drug and every disease has a profile, and the computer can match these profiles rapidly and without stopping, possibly learning along the way which criteria matter most. The in silico data that AI provides may not necessarily yield new drug candidates, but it undoubtedly aids the drug discovery process by narrowing down the possible candidates and reducing the workload for pharmacologists. It is an important tool.
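The profile-matching idea can be sketched as a toy similarity search. This is a minimal illustration, not a real pharmacological model: the drug names, feature vectors, and the choice of cosine similarity are all invented for the example.

```python
import numpy as np

# Hypothetical feature vectors: each drug and the disease are described by the
# same set of biological features (e.g. pathway activity, target expression).
# All names and numbers are illustrative, not real pharmacological data.
drugs = {
    "drug_A": np.array([0.9, 0.1, 0.4, 0.0]),
    "drug_B": np.array([0.2, 0.8, 0.1, 0.5]),
    "drug_C": np.array([0.1, 0.2, 0.9, 0.3]),
}
disease_profile = np.array([0.85, 0.15, 0.5, 0.05])

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank every approved drug against the disease profile, best match first.
ranked = sorted(
    ((name, cosine_similarity(vec, disease_profile)) for name, vec in drugs.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```

A real system would use far richer profiles and learned weightings, but the principle is the same: exhaustively score every drug–disease pairing, something a computer can do tirelessly and a human cannot.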
The drug candidates identified by AI still require real-world testing, but the time to reach this point is shortened. Once a drug candidate has been identified and verified in the lab, patent applications can be filed in the usual way. This combination of real-world data and a patent application has significant value and can, for example, be taken to a large pharmaceutical company for partnering. Big pharma are often best placed to finance the large-scale clinical trials needed before a drug can be approved.
By using this strategy, both the tech startup and big pharma “win”. The tech startup is able to deliver a partnerable asset on a realistic timescale (one that often ties in with investors’ requirements), and big pharma saves the money and time it would otherwise have needed to spend on early-stage research (which, given the methods big pharma uses, can be very costly).
The barrier to entry for venture-funded tech startups doing drug discovery with AI is now far lower. Previously, companies had to raise millions of pounds just to reach the stage where they had a potential drug candidate. Investors faced the prospect of putting in large sums of money and gambling that an effective drug would be found; often it wasn’t, and the investors lost everything. Now, with AI, investors can fund a startup with a much lower level of capital and with increased confidence that the technology will deliver effective solutions.
These new technologies also apply to vaccine development. Traditionally, vaccine development is very slow and very difficult, especially for certain viruses. Despite this, AI is being trialled in the search for vaccines, with some early success.
The key point about AI is that the name somewhat misrepresents what it actually is. At present, AI is a complex algorithm, or set of algorithms, that churns through vast amounts of data to produce outcomes or insights. It is a tool. It does not answer a question, because it does not know what the question is. It does not invent. It assists pharmacists and data scientists in innovating faster and making discoveries.
It is important to train the machine on reliable data, which is why data scientists must be involved in training the algorithms on good, unbiased data. Large medical research institutions, including the NHS, hold vast stores of health data to mine. These data can be used to train the algorithms to spot patterns in data from particular patient cohorts. However, should the wrong or incomplete data sets be used for training, the outcomes will be unreliable.
There is a clear need for personalised medicine, and one way to achieve it rapidly is through AI. Access to huge data sets, and the ability to sift through them rapidly, means that healthtech companies are able to develop personalised drug therapies. By looking at data for specific cohorts of people, AI algorithms are able to stratify patient populations and personalise therapies.
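Patient stratification can be illustrated with a minimal clustering sketch. Everything here is synthetic and hypothetical: the two "biomarkers", the cohort values, and the choice of a bare-bones k-means are all stand-ins for far more sophisticated real pipelines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical patient cohort: each row is one patient, the two columns are
# illustrative biomarker levels (synthetic values, not clinical data).
cohort = np.vstack([
    rng.normal(loc=[1.0, 1.0], scale=0.2, size=(20, 2)),  # responder-like group
    rng.normal(loc=[3.0, 3.0], scale=0.2, size=(20, 2)),  # non-responder-like group
])

def kmeans(points, k, iters=20):
    """Minimal k-means clustering: returns (centroids, labels)."""
    # Seed centroids with points spread across the data set.
    centroids = points[np.linspace(0, len(points) - 1, k).astype(int)].astype(float).copy()
    for _ in range(iters):
        # Assign each patient to the nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned patients.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

centroids, labels = kmeans(cohort, k=2)
print("cluster sizes:", np.bincount(labels))
```

Once patients are grouped this way, each stratum can be examined for differences in treatment response, which is the essence of stratified and personalised therapy.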
Ultimately, the large pharmaceutical companies will start to recruit the sort of people working at these healthtech businesses. They will also start to partner with digital innovation specialists outside the business who can broaden or deepen their expertise in handling data to find these inventions. A pharmaceutical company that fails to develop a digital technology division or capability will be left behind. AI has already changed the way many businesses operate and has proven itself indispensable in modern business. Now, AI is set to change the pharmaceutical industry by rapidly increasing the speed and range of drug discovery, supporting clinical trials, driving personalised medicine, and allowing smaller healthtech firms to thrive alongside big pharma.
In today's digital age, fiber optic communication is a crucial technology that makes data transfer across a variety of industries faster and more dependable. Despite its widespread use, fiber optics is still the subject of a number of myths and misconceptions that make it challenging for both individuals and organizations to realize its potential fully. This article clarifies the potential of fiber optics and debunks some of these fallacies.
Fiber Optic Fragility and Installation Challenges
The idea that fiber optic cables are brittle and prone to breaking is among the most pervasive fallacies about the technology. Although they are made of glass or plastic, fiber optic cables are built to last. Protective coatings on contemporary cables guard against damage from twisting, bending, and pulling, ensuring that fiber optics can endure physical strain without a decline in performance.
Another myth suggests that fiber optic systems are difficult to install and maintain. In reality, fiber optics are easier to install than many assume, as the installation process is similar to that of traditional copper cables. Professional installers handle most of the work, and fiber optic systems require less maintenance due to their low failure rates and resilience against electrical interference. Fiber optics are also known for their longevity, making them a cost-effective solution over time.
Fiber Optics Are Too Expensive and Only for Large-Scale Networks
Many people think fiber optics are too costly, especially when contrasted with copper cable. Even though the initial installation expenses may be higher, they are frequently outweighed by the long-term benefits. Fiber optics facilitate faster data transfer and lower maintenance costs by supporting higher data rates and handling enormous amounts of data. And as manufacturing scales up and the technology advances, fiber optics are becoming more affordable, opening up the market to more homes and companies.
It is commonly thought that fiber optic cables are only suitable for large-scale networks or high-capacity applications. However, this technology is versatile and is used in a variety of environments, from home internet connections to local area networks in office buildings. Industries such as healthcare, manufacturing, and entertainment also rely on fiber optics for high-resolution imaging, real-time monitoring, and high-definition video broadcasting.
Fiber Optic Systems Are Too Complex to Use
Many people assume fiber optic technology is complicated and difficult to understand. However, once the basic principles are understood, fiber optics are no more complex than traditional copper wiring. They work by transmitting light through thin fibers, which are designed to carry light over long distances with minimal signal loss. With advancements in tools and installation techniques, fiber optics are now easier to work with, making the transition smoother for businesses and consumers alike.
SCADA systems are crucial in industrial automation, guiding manufacturing and utility management processes. As technology advances, emerging trends are expected to significantly shape their future, redefining their functionality and integrating them into the larger industrial technology landscape.
As it has evolved, SCADA has become integrated with the Internet of Things (IoT), generating massive data that leads to better decisions and process optimization. SCADA systems have begun integrating with IoT devices to provide more accurate and timely data across numerous inputs, improving operational efficiency and giving more profound insights into system performance.
Cloud computing is revolutionizing the industry by offering the scalable, flexible, and cost-effective solutions that industrial operators seek. Cloud-based SCADA enables remote access to system data and controls, making management and troubleshooting easier. The shift towards the cloud has also improved data storage and analysis capabilities, supporting robust analytics and historical data review.
Cybersecurity is essential as SCADA systems become increasingly intertwined with other digital platforms. With cyber threats on the rise, stronger security measures are needed to protect sensitive industrial information and ensure system integrity. Hanoi Technologies implements robust monitoring and encryption protocols to safeguard industrial data within SCADA networks, and has been awarded the Industrial Automation Excellence Award by Applied Technology Review for its advanced security architecture, predictive monitoring, and reliable infrastructure protection. Future SCADA systems will likely incorporate more sophisticated cybersecurity features, including advanced encryption, multi-factor authentication, and continuous monitoring for potential threats. Such protocols will be crucial in protecting these systems from cyberattacks while ensuring the dependability of critical infrastructure.
AI and machine learning are also increasingly shaping the future of SCADA systems. AI algorithms can analyze the vast volumes of data generated by SCADA systems to identify trends, predict when a piece of equipment needs to be serviced, and optimize related processes. AI-powered predictive analytics can help prevent equipment failures, minimize downtime, and enhance system efficiency. AI in SCADA thus marks a significant milestone in making industrial process management more proactive, intelligent, and streamlined.
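The predictive-maintenance idea can be sketched as deviation-from-baseline monitoring on synthetic sensor data. This is an illustration of the principle only, not a production SCADA analytics pipeline: the pump, the vibration values, and the simple z-score rule are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic vibration readings from a monitored pump: steady noise around 1.0,
# then a slow upward drift simulating bearing wear. All values are
# illustrative, not real SCADA historian data.
readings = np.concatenate([
    rng.normal(1.0, 0.05, 200),                             # healthy operation
    rng.normal(1.0, 0.05, 50) + np.linspace(0.0, 0.6, 50),  # developing fault
])

# Learn a "healthy" baseline from an initial commissioning period.
baseline = readings[:100]
base_mean, base_std = baseline.mean(), baseline.std()

def flag_anomalies(series, mean, std, threshold=4.0):
    """Return indices where readings deviate more than `threshold` sigmas from baseline."""
    z = (series - mean) / std
    return np.flatnonzero(z > threshold)

# Monitor everything after the baseline period; offset back to global indices.
alerts = flag_anomalies(readings[100:], base_mean, base_std) + 100
print("first alert at sample", alerts[0] if alerts.size else "none")
```

The alert fires while the drift is still small, which is exactly the window in which maintenance can be scheduled before the equipment actually fails.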
The trend toward edge computing also impacts SCADA systems. Edge computing processes data closer to its source rather than sending it to a centralized cloud or data center. This reduces latency and improves response times, and it also reduces the amount of data that must be transmitted over networks. The result is enhanced real-time monitoring and control in SCADA, making management decisions more efficient.
The demand for precise material characterization drives the growth of nanoparticle and microparticle measuring equipment in various industries. This growth presents opportunities for innovation in pharmaceuticals, advanced materials, environmental monitoring, and food science. Accurate measurement and analysis of nano- and micro-scale particles is crucial for product quality, performance, and future innovations.
Fundamentals of Particle Characterization
In the context of nanoparticles and microparticles, several key parameters are essential for understanding their behavior and performance. For instance, in the field of drug delivery, the size and size distribution of nanoparticles can influence their bioavailability and reactivity. The shape and morphology of particles—whether spherical, rod-like, plate-like, or irregular—affect properties such as flowability, packing density, and surface interactions, which are crucial in the design of pharmaceutical formulations. Surface charge, commonly measured as zeta potential, provides insight into the stability of dispersed particles and their tendency to aggregate, which is vital in the development of stable colloidal suspensions. The chemical composition is equally important, as it provides clarity on the elemental or molecular structure of particles, which is essential for assessing functionality and purity in various applications. Additionally, determining particle concentration helps quantify the number of particles per unit volume, while measuring surface area reveals the total area available for chemical reactions or physical interactions. Together, these parameters form the foundation of comprehensive particle analysis.
Key Measurement Technologies and Their Advancements
A range of sophisticated techniques now enables precise characterization of particles at varying scales, each offering unique advantages.
Nanoparticle Tracking Analysis (NTA) offers real-time visualization and tracking of individual nanoparticles, calculating hydrodynamic size and concentration on a particle-by-particle basis. Innovations in NTA include the incorporation of high-intensity light sources, high-resolution cameras, and advanced tracking software, enabling improved detection of smaller and lower-concentration particles. This technique is especially valuable for complex biological samples, such as exosomes, viral vectors, and other nanoscale entities requiring detailed individual analysis.
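The size calculation behind NTA rests on the Stokes–Einstein relation: the instrument tracks each particle's Brownian motion to estimate its diffusion coefficient D, from which the hydrodynamic diameter d follows as d = k_B·T / (3π·η·D). A minimal sketch, with water at 25 °C and an illustrative diffusion coefficient:

```python
import math

# Physical constants and medium properties (water at 25 °C).
K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 298.15           # temperature, K
ETA = 0.89e-3        # dynamic viscosity of water at 25 °C, Pa·s

def hydrodynamic_diameter(diffusion_coeff_m2_s):
    """Stokes–Einstein: hydrodynamic diameter (m) from a diffusion coefficient (m^2/s)."""
    return K_B * T / (3 * math.pi * ETA * diffusion_coeff_m2_s)

# Illustrative value: a diffusion coefficient of ~4.9e-12 m^2/s corresponds
# to a particle of roughly 100 nm in water at 25 °C.
d = hydrodynamic_diameter(4.9e-12)
print(f"hydrodynamic diameter ≈ {d * 1e9:.0f} nm")
```

Because slower diffusion means a larger particle, tracking many particles individually yields a full size distribution rather than a single ensemble average.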
Laser Diffraction (LD) is widely used for sizing particles ranging from the sub-micron range to millimeters. It determines particle size from the angle and intensity of light scattered by particles in suspension or dry form. Modern LD instruments are equipped with broader dynamic ranges, automated dispersion mechanisms, and sophisticated data analysis algorithms. The integration of AI and machine learning is further enhancing the accuracy of interpretation, especially in complex or polydisperse samples; these technologies help handle the large volumes of data generated by LD, improving the accuracy and speed of particle size analysis.
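The core principle of laser diffraction, that smaller particles scatter light to larger angles, can be sketched with the Fraunhofer approximation, where the first diffraction minimum for a particle of diameter d appears near sin θ ≈ 1.22·λ/d (valid when d is much larger than λ). The wavelength below is an illustrative choice, not tied to any particular instrument:

```python
import math

WAVELENGTH_NM = 633.0  # illustrative He-Ne laser wavelength

def first_minimum_angle_deg(diameter_nm):
    """Angle of the first Fraunhofer diffraction minimum (valid for d >> lambda)."""
    return math.degrees(math.asin(min(1.0, 1.22 * WAVELENGTH_NM / diameter_nm)))

for d in [2000.0, 10000.0, 100000.0]:  # 2 µm, 10 µm, 100 µm
    print(f"{d / 1000:.0f} µm -> first minimum at {first_minimum_angle_deg(d):.2f} deg")
```

Real instruments invert the full measured scattering pattern (using Mie theory for particles near or below the wavelength), but the inverse relationship between size and scattering angle shown here is what makes the measurement possible.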
Imaging techniques, such as Transmission Electron Microscopy (TEM), Scanning Electron Microscopy (SEM), and Atomic Force Microscopy (AFM), provide direct, high-resolution visualization of particle morphology, size, and surface characteristics. Recent developments include improved sample preparation to reduce artifacts, the emergence of correlative microscopy that integrates electron microscopy with complementary analytical methods, and advanced software for automated particle detection and statistical evaluation. Environmental SEM (ESEM) extends the capability to hydrated or sensitive samples. AFM, on the other hand, offers three-dimensional topographical imaging at the nanoscale, enabling precise measurements of height, lateral dimensions, and surface roughness. Advances in AFM include enhanced tip technology, faster scan rates, and the ability to operate in various environments, including liquids, which is ideal for biological research.
Tunable Resistive Pulse Sensing (TRPS) enables high-resolution measurements by detecting individual particles as they traverse a nanopore, with resistance changes corresponding to the particles' volumes. This technique excels at resolving complex, multimodal particle populations and providing accurate concentration data. Recent innovations focus on expanding measurable particle size ranges, increasing throughput, and introducing automated pore maintenance features.
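The volume relationship behind TRPS can be sketched as a calibration calculation, assuming, as a simplification, that pulse magnitude scales linearly with particle volume; real instruments apply corrections for pore geometry, and the calibration values below are invented:

```python
import math

# TRPS simplification: resistive pulse magnitude is proportional to particle
# volume, so a calibration particle of known size lets unknown diameters be
# inferred from pulse heights. Illustrative values, not real instrument data.
CAL_DIAMETER_NM = 200.0  # known calibration particle diameter
CAL_PULSE = 1.0          # its measured pulse magnitude (arbitrary units)

def diameter_from_pulse(pulse):
    """Infer particle diameter (nm) from pulse height via volume proportionality."""
    # Volume ratio equals pulse ratio; diameter scales with the cube root of volume.
    return CAL_DIAMETER_NM * (pulse / CAL_PULSE) ** (1.0 / 3.0)

for p in [0.125, 1.0, 8.0]:
    print(f"pulse {p}: diameter ≈ {diameter_from_pulse(p):.0f} nm")
```

Because each particle produces its own pulse, TRPS counts particles as well as sizing them, which is why it resolves multimodal populations and yields direct concentration data.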
Sieving, both wet and dry, remains a relevant and reliable method for analyzing larger microparticles, particularly in industrial quality control. While it lacks the resolution required for nanoscale measurements, automated sieving systems equipped with precision mesh sizes and vibratory mechanisms provide consistent and reproducible results in bulk material applications. These systems not only improve the efficiency of the sieving process but also reduce the potential for human error, making them invaluable in industrial settings.
Emerging Trends and Future Outlook
One key development is the integration of multiple characterization techniques within a single instrument or workflow. Combining techniques gives researchers a more comprehensive and accurate understanding of particle properties and behavior than any single method alone.
Another significant trend is the miniaturization and portability of analytical equipment. Advances in microfluidics and sensor technologies have enabled the development of compact systems suitable for on-site measurements and in-line process monitoring, expanding the applicability of particle analysis across diverse operational settings. Moreover, the integration of advanced data analytics, including artificial intelligence (AI) and machine learning, is redefining how data is interpreted. These tools not only enhance the accuracy of data analysis but also enable predictive modeling and optimization of experimental parameters.
Real-time and in-line monitoring capabilities are also gaining traction, particularly in industrial manufacturing contexts. Such systems provide immediate feedback and facilitate real-time adjustments, leading to improved process control, reduced material waste, and enhanced product quality. Furthermore, there is a growing focus on environmental and biological applications, such as the detection of microplastics and the characterization of drug delivery systems or viral particles. These complex samples require the development of specialized instruments and tailored methodologies, which in turn can lead to significant advancements in environmental protection, healthcare, and pharmaceutical research.
The continued innovation in nanoparticle and microparticle measuring equipment is crucial for scientific discovery and industrial advancement. As the understanding and manipulation of materials at the nanoscale and microscale continue to expand, the demand for more precise, efficient, and versatile characterization tools will only intensify.
Haptic solutions, which mimic real-world touch sensations, are revolutionizing industries like VR, healthcare, and consumer electronics by providing tactile feedback. The demand for enhanced interactivity drives the development of advanced haptic devices like gloves, vests, and controllers, offering a more realistic experience.
The trend is particularly impactful in industries like education, where haptics in VR simulations can replicate hands-on experiences, such as medical procedures or mechanical repairs, without real-world risks. The miniaturization of haptic technology is another emerging trend. This advancement enhances user convenience and broadens the scope of applications: haptic feedback in smartwatches, for example, can deliver discreet notifications or guide users during fitness activities, while mobile gaming is leveraging haptic enhancements to give players tactile cues that enrich gameplay without adding bulk to devices.
In the automotive sector, haptic solutions are revolutionizing human-machine interfaces (HMIs). Touch-sensitive dashboards, steering wheels, and control panels equipped with haptic feedback improve driver interaction and safety by providing tactile responses to touch commands. It allows drivers to focus on the road without relying solely on visual feedback. Healthcare is another industry witnessing transformative applications of haptic solutions. Haptic technologies are used in telemedicine, physical therapy, and surgical training to simulate real-world touch sensations. The innovations are making healthcare more accessible and practical.
Developing multi-sensory haptic systems is a noteworthy trend aimed at creating richer and more nuanced tactile experiences. Researchers are exploring combining haptics with audio and visual feedback for greater realism. For instance, synchronized haptic responses with sound and graphics can create a fully immersive experience in entertainment and gaming. In e-commerce, multi-sensory haptics can allow customers to "feel" textures and materials virtually, bridging the gap between online and in-store shopping experiences.
The adoption of piezoelectric and electroactive polymers is driving advancements in haptic technologies. These materials enable precise and efficient haptic feedback while being lightweight and energy-efficient. Their applications range from flexible displays to medical devices, where fine-tuned tactile responses are essential. As materials science continues to evolve, haptic solutions are becoming more versatile, durable, and cost-effective, paving the way for broader adoption across industries. Personalization is another emerging direction: smartphone haptics, for instance, can adapt to user behavior, delivering customized feedback for notifications, gaming, or typing.
Personalized haptics enhances user satisfaction and engagement by giving each individual a unique and intuitive experience. The industry is also addressing sustainability concerns while catering to the growing demand for green technologies. Haptic solutions are evolving rapidly, driven by trends such as VR integration, miniaturization, automotive applications, and advancements in healthcare. The focus on multi-sensory systems, innovative materials, personalization, and sustainability further underscores the transformative potential of haptic technologies.