| Javier García Martínez
Professor of Chemistry and Director, Molecular Nanotechnology Lab, University of Alicante
| Joseph Costantine
Associate Professor of Electrical and Computer Engineering, American University of Beirut
From rollable computer screens to “smart” clothing, the future of electronics looks to be increasingly flexible. The rapidly escalating development of wearable devices, flexible electronics and bendable displays demands power sources that match the agility of these systems. Standard, rigid batteries may soon be a thing of the past as thin, flexible batteries – made of lightweight materials that can be easily twisted, bent or stretched – reach the market.
Discover expert analysis related to flexible batteries on the Strategic Intelligence Platform.
Several types of flexible batteries are currently available. These batteries are rechargeable and include lithium-ion or zinc-carbon systems placed on conductive polymer current collectors. In some cases, additives enhance conductivity and flexibility.1 The electrodes of flexible batteries can be coated with – or even printed onto – flexible substrates, including carbon-based materials like graphene, carbon fibres or cloth.
Flexible batteries have applications in a growing number of fields, including wearable medical devices and biomedical sensors, flexible displays and smartwatches. Health-related applications powered by these batteries could transmit data wirelessly to healthcare providers, facilitating remote patient monitoring. Further, flexible batteries that can be integrated into the fabric of jackets, shirts or other apparel will be required to power emerging textile-based electronics with capabilities ranging from built-in heating systems to health monitoring.
The flexible battery market is expected to expand rapidly in the coming years. One study forecasts that the global flexible battery market will grow by $240.47 million from 2022 to 2027, accelerating at a compound annual growth rate of 22.79% during this period.2 The primary drivers of growth are expected to be the increasing demand for wearable devices and the growing trend towards miniaturization and flexibility of electronics.
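As a back-of-the-envelope check on those forecast figures (a sketch only: the study does not state the 2022 base-market value, so the implied base below is an inference from the quoted CAGR and increment):

```python
# Illustrative compound-growth arithmetic for the flexible battery market.
# Assumption: the 22.79% CAGR applies over the five annual periods 2022-2027.
cagr = 0.2279
years = 5

growth_factor = (1 + cagr) ** years             # total multiple over the period
increment = 240.47                              # forecast growth, $ millions
implied_base = increment / (growth_factor - 1)  # inferred 2022 market size

print(f"growth factor: {growth_factor:.2f}x")             # ~2.79x
print(f"implied 2022 base: ${implied_base:.0f} million")  # ~$134 million
```

In other words, a 22.79% CAGR sustained for five years nearly triples the market, consistent with the forecast's characterization of rapid expansion.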
Several companies are actively developing and commercializing flexible battery technology, including LG Chem, Samsung SDI, Apple, Nokia, Front Edge Technology, STMicroelectronics, Blue Spark Technologies and Fullriver Battery New Technology.3 However, there is still room for innovation in this space, and new players are likely to enter the market as the technology evolves. The ability of flexible batteries to be bent, twisted and stretched makes them ideal for use in wearable devices. As the market demand for wearable technologies continues to grow, the future of flexible batteries is promising, and further advances are likely. As with all batteries, one hurdle to overcome is their safe disposal and recycling, which should come as the technology and associated applications become circular. Revolutionary advances in flexible-battery technologies and their accompanying industries are expected to continue for many years to come.
| Olga Fink
Professor of Intelligent Maintenance and Operations Systems, EPFL
| Julien Weissenberg
Founder, Deep Tech Experts
Generative artificial intelligence (AI) is a powerful type of AI that can create new and original content by learning patterns in data, using complex algorithms and methods of learning inspired by the human brain. While generative AI is currently focused on producing text, computer programming, images and sound,4 this technology could be applied to a range of purposes, including drug design, architecture and engineering.
For example, at the time of this writing, initial work has been published on generating candidate drug molecules targeting particular conditions5 and on creating pictures of imaginary buildings or generating interior designs. NASA engineers are currently working towards AI systems that can construct lightweight spaceflight instruments, achieving a 10-fold reduction in development time while simultaneously improving structural performance.6 Generative AI technologies may even impact the food industry and the design of everyday objects, from furniture to appliances. In scientific research, generative models could facilitate breakthroughs by improving experimental design, identifying relationships between data elements and creating new theories. For example, recently developed AI algorithms can translate a mathematical formula into plain English or analyse brain activity data to generate drawings of the objects that human participants are holding in mind.7,8
High school and university students are using generative AI more frequently, with some institutions forbidding their use while others are integrating generative models into teaching practices9 or even training students to master these tools. Used properly, generative AI can create personalized curricula that adjust to student skills and learning progress while encouraging critical thinking, igniting creativity and harnessing novel ideas.
In the workplace, the use of AI-based language models like the recently popular ChatGPT or its successors can increase productivity and improve output quality, restructuring human tasks towards idea generation and editing as opposed to rough drafting.10 Generative AI technologies specifically benefit low-ability workers and can increase job satisfaction and self-efficacy. Given the potential for productivity gains resulting from adopting these new technologies, it’s crucial to acknowledge the likelihood of job displacement. As such, policies and programmes that support workers in their efforts to upskill and reskill are essential in ensuring that the benefits of technological innovation are widely shared and that workers are equipped with the skills needed to thrive in the changing job market.
The newest developments involve autonomous AI systems that can make important decisions or take significant actions. For instance, AutoGPT is an autonomous AI application using the GPT-4 language model. AutoGPT can automatically accomplish a user-identified goal by dividing the goal into smaller tasks and employing tools like internet searches or text-to-speech technology. The growing integration of generative AI technologies, particularly autonomous AI, into multiple aspects of people’s daily lives is generating both public excitement and concern.
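The plan-then-execute loop described above can be sketched in a few lines. This is a minimal illustration, not AutoGPT's actual implementation: the `plan` and `execute` functions below are hypothetical stand-ins for calls to a language model and its tools.

```python
# Minimal sketch of an autonomous-agent loop: decompose a goal into tasks,
# run each task with a "tool", and accumulate the results.
# plan() and execute() are hypothetical stand-ins for LLM and tool calls.

def plan(goal: str) -> list[str]:
    """Stand-in for an LLM that splits a goal into sub-tasks."""
    return [f"research: {goal}", f"draft: {goal}", f"review: {goal}"]

def execute(task: str) -> str:
    """Stand-in for tool use (web search, text generation, etc.)."""
    return f"done({task})"

def run_agent(goal: str) -> list[str]:
    results = []
    for task in plan(goal):            # 1. break the goal into smaller tasks
        results.append(execute(task))  # 2. carry out each task with a tool
    return results                     # 3. combined results serve the goal

outputs = run_agent("summarize market trends")
print(outputs)
```

Real systems add a feedback step in which the model inspects each result and revises the remaining plan, which is what makes them autonomous rather than merely scripted.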
To build public trust in generative AI, applications should meet agreed-upon professional and ethical standards. Generative AI systems represent the data they were trained on and the conventions governing society at that time. Care should be taken to mitigate AI bias based on training data, with a focus on including “outlier” data and novel societal conventions. Further, the decision-making processes of an application should be easy to understand, an application’s goals should be clearly disclosed to operators and end users, and individual privacy should be respected. Ethical guidelines and governance structures must be developed to mitigate potential harm and ensure that technical progress is balanced with responsible use. Finally, copyright attribution must be addressed so that proper credit is given to AI designers, creators of training data and authors of instructions for using the applications.
With the correct controls in place, generative AI can provide more time for creativity, demonstrate the boundaries of knowledge, and act as a sparring partner to challenge conventional thinking.
| Mariette DiChristina
Dean and Professor of the Practice in Journalism, Boston University College of Communication
| Lee Sang-Yup
Senior Vice-President for Research, Korea Advanced Institute of Science and Technology
| Lauren Uppink Calderwood
Head, Climate Strategy, Centre for Nature and Climate, World Economic Forum
Aviation accounts for 2-3% of global CO2 emissions annually, with concerning “business-as-usual” projected emissions of 39 gigatonnes between 2022 and 2050.11,12 While the use of electric vehicles for ground transport is rapidly increasing, the aviation sector has struggled with decarbonization because energy-dense fuels are required for long-distance flights. Additionally, the high price of replacing aircraft means that the current fleet will remain in operation for decades, and electric or hydrogen-fuelled planes may not be viable for long-distance flight in any case.
Enter a solution that does not require large-scale changes to current aviation infrastructure and equipment: sustainable aviation fuel (SAF), produced from biological (e.g. biomass) and non-biological (e.g. CO2) resources. Combined with other decarbonization strategies, including system-wide operational efficiencies, new technologies and carbon offsets, SAF should move the airline industry towards reaching net-zero carbon emissions in the coming decades.
Today, SAF makes up less than 1% of global jet fuel demand, but this must increase to 13-15% by 2040 to put the aviation industry on the path to net zero by 2050.13 Such an increase will require the creation of 300-400 new SAF plants, and airlines, manufacturers and fuel companies are working around the clock to enable this level of scale.
Fortunately, the production of SAF from biogenic raw materials using renewable energy is steadily increasing. According to the International Air Transport Association, SAF production reached at least 300 million (optimistically 450 million) litres in 2022, nearly triple that produced in 2021.14 An increasing number of airlines have committed to using SAF, a trend that will be accelerated through global efforts such as the World Economic Forum’s Clean Skies for Tomorrow initiative15 and First Movers Coalition.
The American Society for Testing and Materials (ASTM) has approved nine SAFs for blending at a ratio of up to 50% with conventional petroleum-based jet fuel.16 The first SAF, approved by ASTM in 2009, is produced by converting syngas (a mixture of carbon monoxide and hydrogen) into hydrocarbons through a series of chemical reactions. Syngas can be prepared from biomass or wastes or, better yet, from captured CO2 and green hydrogen using renewable energy.
The second SAF, approved in 2011, is produced from plant oil and animal fat. The availability and collection of raw materials, along with the need for sustainably produced green hydrogen, remain major challenges for this option. Metabolically engineered microorganisms that can break down abundant, non-edible biomass could potentially reduce dependence on plant oils and animal fats.17
Over the past several years, seven more SAFs have been approved, with other exciting candidates still in active development. One example uses engineered bacteria to improve the SAF’s energy profile.18 In 2023, a consortium of actors in the United Kingdom is poised to deliver the first net-zero transatlantic flight using solely sustainable aviation fuel, demonstrating the potential of this rapidly evolving technology and moving the world closer to net-zero aviation.
| Mine Orlu
Professor of Pharmaceutics, University College London (UCL) School of Pharmacy, Faculty of Life Sciences, UCL
| Wilfried Weber
Scientific Director, Leibniz Institute for New Materials
The number of microbes living on and within the human body matches, and may even exceed, the number of human cells.19 The community of microbes an organism harbours is called its microbiome, and the microbiomes of humans, animals and plants play important roles in the health of these organisms.20,21
Recent advances allow engineering of the microbiome to benefit human well-being and agricultural productivity. Key to this engineering are phages – viruses that selectively infect specific types of bacteria. Upon infection, a phage injects its genetic information into the bacterium. Using synthetic biology tools, the genetic information of phages can be reprogrammed so that infected bacteria execute a bioengineered set of genetic instructions. With bioengineered phages, scientists can change a bacterium’s functions, causing it to produce a therapeutic molecule or to become sensitive to a certain drug, for example. As phages generally only infect one type of bacteria, individual bacterial species within the complex microbiome can be targeted.
Designer phages are showing potential for treating microbiome-associated diseases such as hemolytic uremic syndrome (HUS) – a rare but serious condition that affects the kidneys and blood-clotting functions, caused by a certain species of E. coli. Scientists engineered the genetic material of an E. coli-infecting phage to encode genetic “scissors” that can chop up the E. coli genes that lead to HUS. Animal studies demonstrated that administration of these designer phages significantly reduced the presence of the HUS-causing strain of E. coli in the microbiome and alleviated HUS symptoms.22 This approach was recently granted an orphan drug designation by the U.S. Food and Drug Administration, poising it for clinical trials.23 Phages are also being designed as feed supplements to enhance the growth of livestock, treat certain plant diseases and eliminate dangerous bacteria in food supply chains, in alignment with the World Health Organization’s “One Health” approach.
Promising early results of designer phage therapies are attracting significant venture capital that will help to facilitate clinical testing of engineered phages. Potential applications of designer phages are numerous and diverse. Locus Biosciences is using engineered phages to combat antibiotic-resistant bacteria, whereas Eligo Biosciences is pursuing similar approaches to make certain bacteria less pathogenic. Of the 44 phage-related clinical trials with therapeutic intent, 29 have been posted since the beginning of 2020.24 Phage-based therapies involving both natural and designer phages will continue to emerge as a powerful method to engineer microbiomes, enhancing the health of humans, animals and plants.
| Corinna Lathan
Co-Founder and former Chief Executive Officer, AnthroTronix
| Geoffrey Ling
Professor of Neurology, Johns Hopkins Hospital
The Surgeon General of the United States recently declared war on what he calls “one of the country’s most pressing public health issues of our time”. Excess screen time and social media can decrease psychological well-being,25 but they can also enhance well-being when used responsibly.26 Screen time spent building connections in shared virtual spaces might help combat the growing mental health crisis as opposed to contributing to it.
Virtual shared spaces are digital environments where people can interact professionally and socially. The future of these spaces is commonly referred to as the metaverse, which may include virtual shared spaces enhanced with augmented or virtual reality (AR/VR). Just as multiple shared virtual platforms currently exist, there will likely be multiple metaverses, differing in purpose and level of immersiveness.
The mental health crisis that existed prior to the COVID-19 pandemic has since increased to unprecedented levels,27 making conditions ripe for metaverse-enabled mental health treatment. The number of mental health providers is insufficient to meet the escalating crisis,28 and, in the United States, a federal reimbursement opportunity for tele-mental health services is in the works to combat this shortage.29 Ideally, a mental health-centred technology-based infrastructure will support all aspects of mental health: prevention, diagnostics, therapy, education and research.
Gaming platforms are already being leveraged for mental health treatment. Such platforms not only increase patient engagement but also help destigmatize mental health issues. For example, DeepWell Therapeutics has created video games to treat depression and anxiety; UK-based Xbox studio Ninja Theory has incorporated mental health awareness into mass-market games and plans to expand into treatment with their Insight Project; and TRIPP has created Mindful Metaverse, which enhances well-being through VR-enabled guided mindfulness and meditation.30
Maturing interface technologies could further augment social and emotional connections between distant participants. For example, Emerge Wave 1 is a tabletop device that uses ultrasonic waves to simulate touch, enhancing users’ social experience. Non-invasive neurotechnologies can even provide feedback attuned to a user’s emotional state. For example, Neurable headsets use electrodes to measure emotion and can adjust music accordingly. Eventually, the metaverse will also connect to therapeutic neurotechnologies, such as direct brain stimulation to treat intractable depression.31
Leveraging the metaverse for the continuum of mental healthcare needs could be a win-win. Not only would patients benefit, but grounding the metaverse in a practical, necessary application could drive the emergence of this advancing virtual space.
| Rona Chandrawati
Associate Professor, University of New South Wales
| Carlo Ratti
Director, Massachusetts Institute of Technology (MIT) Senseable City Lab
The United Nations Food and Agriculture Organization states that world food production will need to increase by 70% to feed the world’s population in 2050.32 Technological innovations in agriculture will be a key step towards meeting this dramatic escalation and improving the world’s food security.
Traditionally, crops have been monitored via soil testing and visual inspections, both of which are expensive and time-consuming. Recent technological advances have improved the ease of crop monitoring, enabling farmers to monitor crop conditions at a larger scale. For several years, the health of farmland has been monitored using low resolution satellite data.33 Now, sensor-equipped drones and tractors are providing higher-resolution information about crop conditions.34,35 Resultant information from all forms of monitoring can be processed using AI. The next frontier in crop monitoring is even higher resolution: the monitoring of individual plants.
Wearable plant sensors promise to improve plant health and increase agricultural productivity. These sensors are small, non-invasive devices that can be attached to crop plants for continuous monitoring of temperature, humidity, moisture and nutrient levels. Data from plant sensors can optimize yields, reduce water, fertilizer and pesticide use, and detect early signs of disease.
Two companies, Growvera and Phytech, have independently developed micro-sized needle sensors that insert into a plant’s leaves or stems to measure changes in electrical resistance. Data are transmitted wirelessly to a computer or mobile device, where they are analysed to generate insights about plant health. Farmers can thus monitor crops in real time and perform precise interventions based on the specific demands of plants, such as adjusting irrigation or fertilizer application in response to moisture levels or nutrient data.
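The sensor-to-insight pipeline described above can be sketched in miniature. The calibration thresholds below are hypothetical, chosen purely for illustration: real systems such as those from Growvera and Phytech use proprietary, crop-specific models.

```python
# Sketch: turning raw resistance readings from a stem-mounted sensor into an
# irrigation recommendation. Thresholds are hypothetical, for illustration.

DRY_THRESHOLD_OHMS = 1500.0   # above this, tissue is assumed water-stressed
WET_THRESHOLD_OHMS = 800.0    # below this, tissue is assumed well-hydrated

def moisture_status(resistance_ohms: float) -> str:
    if resistance_ohms > DRY_THRESHOLD_OHMS:
        return "irrigate"     # high resistance -> low water content
    if resistance_ohms < WET_THRESHOLD_OHMS:
        return "hold"         # low resistance -> ample water
    return "monitor"          # in between: keep watching the trend

readings = [650.0, 1200.0, 1750.0]   # simulated hourly readings
advice = [moisture_status(r) for r in readings]
print(advice)
```

In practice the analysis runs on streamed, timestamped data and weighs trends rather than single readings, but the core idea is the same: a per-plant measurement is mapped to a precise, per-plant intervention.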
Much work remains. Wearable sensors can be expensive to install and maintain, and interpretation of sensor data may require specialized expertise. Improved data analytics tools are needed to help farmers make informed decisions about crop management from sensor data. The long-term effects of wearable sensors on plant growth and development also warrant investigation.
Despite these challenges, wearable plant sensors are poised to revolutionize crop production and management. By providing real-time data about plant health and environmental conditions, these devices can help farmers optimize agricultural productivity, reduce waste and minimize agriculture’s environmental impact – all while helping to feed the world’s growing population.
| Elizabeth O’Day
Chief Executive Officer and Founder, Olaris
| Angela Ruohao Wu
Associate Professor, Hong Kong University of Science and Technology
| Xu Xun
Director, BGI Research
The human body is composed of approximately 37.2 trillion cells. How do they all work together to keep us alive and healthy? Spatial omics may provide researchers with an answer. By combining advanced imaging techniques with the specificity and resolution of DNA sequencing, this emerging method enables the mapping of the what, where and when of biological processes at the molecular level. Starting with an organ of interest (such as a mouse brain), scientists slice tissue into sections only one cell thick. Innovative techniques are then used to visualize the locations of specific biomolecules in each slice.36,37,38 Spatial omics allows previously unobservable cell architecture and biological events to be viewed in unprecedented detail.
A new generation of molecular-level “cell atlases” are under development thanks to spatial omics, detailing the myriad biological processes occurring in humans and other species.39,40 For example, using spatial omics, scientists constructed a three-dimensional cell atlas of fruit fly larvae and unlocked the black box of organ development in mouse embryos.41,42,43 Another study revealed that the injured amphibian axolotl brain heals itself using mechanisms mirroring those activated during brain development.44 Spatial omics also shows promise in therapeutic discovery. Using this technique, scientists identified a population of neurons in the spinal cord that appears to be responsible for recovery after spinal cord injury. Stimulating these neurons in paralysed mice sped up their recovery to walking.45 Additional health-related applications include characterizing the various cell types in a tumour to customize treatment and unravelling the mechanisms of complex diseases like Alzheimer’s disease and rheumatoid arthritis.46 Infectious diseases can also be investigated using spatial omics. For example, a spatial omics study of samples from people who died from COVID-19 revealed that SARS-CoV-2 causes widespread disruption of cellular pathways across all tissues.47
The need to democratize and scale up spatial omics technologies is pressing. With a total market value of $232.6 million in 2021 and an estimated revenue of $587.2 million in 2030, a growing list of public and private companies are seeking to provide spatial omics solutions.48 While academic and translational research centres made up 89% of the market in 2020,49 the market is dramatically expanding to include pharmaceutical and biotech industries.
To realize the full promise of spatial omics, technical challenges around data acquisition, processing, storage and standardized reporting must be addressed. Further, applications should be expanded to map other biomolecules, such as metabolites, and other organisms, including plants and invertebrates, to further illuminate the underlying biology. In the brief time since Nature Methods selected spatial omics as the method of the year in 2021,50 it has evolved from a niche technique to one that is poised to become standardized and widely employed, revolutionizing the understanding of life.
| Wendy Ju
Associate Professor, Cornell Tech
| Geoffrey Ling
Professor of Neurology, Johns Hopkins Hospital
| Ruth Morgan
Vice-Dean (Interdisciplinary Entrepreneurship), Faculty of Engineering Sciences, UCL
| Angela Ruohao Wu
Associate Professor, Hong Kong University of Science and Technology
In recent years, brain-machine interfaces (BMIs) have gained visibility, igniting collective imaginations regarding the power and potential of one day controlling machines with thoughts. BMIs allow electrical signals the brain produces to be captured by sensor hardware. Algorithms then decode these electrical signals into instructions that a computer can understand and execute. BMI-like systems are already used to treat patients with epilepsy, and in neuroprosthetics – prosthetic limbs use electrodes to interface with the nervous system.51,52
Despite initial successes, there are challenges to these technologies. Current implants used by doctors are made of hard materials, like the chips inside a laptop or phone, and they can trigger long-term scarring and cause substantial discomfort. They cannot bend or adapt to brain movements so, over time, they “drift” in position, decreasing the accuracy of the captured signals. Non-invasive methods, like electrodes placed on the outside of the skull, do not require surgical implantation but provide only muffled, difficult-to-decode signals – like listening to a person talk through a thick face mask.
Researchers have recently developed brain interfacing circuits on biocompatible materials that are soft and flexible. Flexible circuits can conform to the brain, reducing scarring and sensor drift, and they can be packed with enough sensors to stimulate millions of brain cells at once, vastly outperforming the scale and timeframe of hard probes.53
When used in neuroscience research, flexible BMIs could deepen understanding of neurological conditions such as dementia and autism. In the clinic, flexible BMIs could provide greater control of neuroprosthetics without requiring frequent recalibration.54 Applications of flexible BMIs55,56 are already undergoing US Food and Drug Administration (FDA)-approved clinical trials, rapidly making this technology a reality. In the future, other implantable devices, such as cardiac pacemakers, could adopt similar types of materials.
Looking forward, advances in materials manufacturing and soft-circuit printing could further improve flexible BMI technologies, eventually leading to true human-AI interfacing. As with many emerging technologies, broad ethical issues must be considered prior to the wide implementation of these interfaces. Potential health outcomes must be balanced with public acceptance and trust. Further, given the sensitive nature of brain-derived data, privacy and ethical use guidelines must establish how these data can be used in the short, medium and long term.
| Olga Fink
Professor of Intelligent Maintenance and Operations Systems, EPFL
| Andrew Maynard
Professor of Advanced Technology Transitions, Arizona State University
While the Earth is indisputably facing a worsening environmental crisis, increasing reliance on data may not seem to play much of a role. Yet data centres, which facilitate Google searches, email, the metaverse, AI and myriad other aspects of an increasingly data-based society, consume an estimated 1% of the electricity produced globally,57 and this amount will only increase with growing demand for data services. While there is no single “green data” magic bullet, the coming decade is expected to bring substantial strides toward net-zero-energy data centres as emerging technologies are combined and integrated in innovative ways.
First, to address heat-management issues, liquid cooling systems are being developed that use water or dielectric coolant to dissipate heat, and excess heat is being repurposed for applications including space heating, water heating and industrial processes. For instance, the city of Stockholm is implementing projects to harness waste heat from data centres to heat homes.58
Second, AI is being used to analyse and optimize energy use in real time, maximizing efficiency without compromising performance. DeepMind has successfully demonstrated the potential of AI-powered energy management, achieving up to a 40% reduction in energy consumption at Google’s data centres.59
Third, the technological infrastructure supporting net-zero-energy data centres is becoming more modular and demand-based. For instance, cloud and edge computing systems allow data processing and storage to be spread across multiple devices, systems and even locations.60,61 As an example, Crusoe Energy installs its modular data centres at sites where methane flaring occurs to enable cloud computing infrastructure to be powered by methane gas that would otherwise have been released directly into the atmosphere. These and other prefabricated units can be easily deployed, expanded or relocated, allowing data centre operators to optimize energy use and adapt to their companies’ changing needs. Additional innovations in software and hardware include novel computing architectures like systems on a chip;62 and optimizations such as energy-proportional computing, in which computers use energy proportional to the amount of work being performed.63
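The energy-proportional computing idea mentioned above can be made concrete with a toy model (the power figures below are hypothetical, for illustration): a conventional server draws a large fixed idle load regardless of work done, while an ideal energy-proportional server scales power with utilization.

```python
# Toy model of energy-proportional computing. Power figures are hypothetical.
# A conventional server draws much of its peak power even when idle; an
# energy-proportional server scales power with the work being performed.

PEAK_WATTS = 400.0

def conventional_power(utilization: float) -> float:
    idle = 0.6 * PEAK_WATTS                        # high fixed idle draw
    return idle + (PEAK_WATTS - idle) * utilization

def proportional_power(utilization: float) -> float:
    return PEAK_WATTS * utilization                # ideal: power ~ work

util = 0.3   # servers often run well below peak utilization
saving = 1 - proportional_power(util) / conventional_power(util)
print(f"power saved at {util:.0%} utilization: {saving:.0%}")
```

Because real servers spend most of their time at modest utilization, closing the gap between these two curves is one of the larger levers available to data centre operators.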
Achieving net-zero-energy data centres will involve integrating the approaches described above with new electricity generation, storage and management technologies. Given the wave of innovation and investment in this area, there is reason to be optimistic about the years ahead.
| Daniel E. Hurtado
Associate Professor, Pontifical Catholic University of Chile
| Andrew Maynard
Professor of Advanced Technology Transitions, Arizona State University
| Bernard S. Meyerson
Chief Innovation Officer Emeritus, IBM
| Mine Orlu
Professor of Pharmaceutics, UCL School of Pharmacy, Faculty of Life Sciences, UCL
| Landry Signe
Senior Fellow, Brookings Institution
The shortcomings of healthcare systems all over the world became abundantly and horrifyingly clear during the early days of the COVID-19 pandemic when the sustainable workloads of many hospitals were rapidly exceeded. In response, government-based and academic teams have been created to integrate AI and machine learning (ML) into healthcare – both to anticipate impending pandemics and to aid in effectively addressing them (AI4PEP).64,65 These emergent efforts to enhance the efficacy of national and global healthcare systems in the face of major health crises, and to democratize access to care, are in their initial stages but will rapidly scale up by integrating quality data into the AI and ML models.66
AI-based technologies could also help to tackle a related challenge – the long delays many patients experience when attempting to obtain medical care through the healthcare system.67 Surprisingly, delays often arise not from a lack of capacity but due to uneven access to – and resultant underutilization of – existing facilities. When applied to a curated data set of existing medical facilities, AI, ML and data analytics techniques dramatically improved patient access to treatments. Medical Confidence, a subsidiary of CloudMD, used such technology to optimally align patient treatment needs with facility availability, enabling dramatic reductions in treatment wait times – in some cases, from many months down to only weeks.68 An AI-based approach to optimizing access to care is becoming broadly adopted in Canada and will likely be replicated elsewhere.
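The core of such wait-time optimization can be sketched as a matching problem. The data and greedy rule below are hypothetical, for illustration: real systems such as Medical Confidence's match over far richer criteria, including clinical need, location and capacity.

```python
# Sketch of wait-time-aware patient routing: send each patient to the
# facility offering the needed treatment with the shortest current wait.
# Facility names and wait times are hypothetical, for illustration.

facilities = {
    "Clinic A": {"physio": 2, "mri": 26},   # wait times in weeks
    "Clinic B": {"physio": 9, "mri": 3},
    "Clinic C": {"mri": 12},
}

def best_facility(treatment: str) -> tuple[str, int]:
    options = [
        (waits[treatment], name)
        for name, waits in facilities.items()
        if treatment in waits
    ]
    weeks, name = min(options)   # shortest wait wins
    return name, weeks

print(best_facility("mri"))
```

Even this naive rule shows the underlying insight: when demand is spread unevenly, routing on live availability data can cut a 26-week wait to 3 weeks without adding any capacity.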
The impact of AI-based healthcare could be even more profound in developing nations, which often lack the infrastructure and personnel to deliver health services to much of their populations. Intelligent tools to assist in the identification, monitoring and treatment of new or ongoing medical conditions – such as an AI-based system to facilitate the reading of radiological data69 – are a first step in leveraging AI and ML to enhance healthcare capabilities in locations where care is currently inadequate. India, for example, has a widely dispersed population of over 1.4 billion and has embraced an AI-based approach to enhance medical outreach. The Indian government has enabled physicians to engage remote communities through assistive technologies, with requisite privacy safeguards in place.
In addition to protecting data privacy and gathering quality data needed to generate these insights, other challenges to implementing AI-facilitated healthcare approaches include bolstering public acceptance and universal adoption of such technologies, assuring patient compliance and addressing possible national security concerns. While these remaining hurdles may be challenging to overcome, the risks of inaction are clear.
Moreover, any system that curates personal data on the health and welfare of a vast population must function within the bounds of a carefully crafted legal and ethical framework. Such considerations are already the topic of extensive discussion,70 and legal frameworks are beginning to emerge in anticipation of the global application of AI and ML to healthcare. AI-based healthcare solutions will become ever-more pervasive in the next three to five years, to the great benefit of human health – particularly for those in underserved populations.