Digital Technology Week: Debunking the myths of digital technology

Posted on the 29th April 2024

Paul Kuiken traces the origins of digital advancements and the need to comprehend terms like Industry 4.0, Industry 5.0, AI, and Digital Transformation.

With the ‘noise’ surrounding digital technology, it is important to establish a baseline understanding.

To navigate this evolving landscape, it is crucial to understand these terms and their practical implications. 

Tracing the Origins

Many digital advancements find their roots in the manufacturing sector due to its structured and controlled processes. The sector's ability to gather substantial data and implement statistical control methods positioned it as a pioneer in the digital revolution. However, other industries with repeatable, transactional, and data-driven processes, such as financial services (“Fintech”), have also been able to take advantage.

General Terms and Definitions

Terms like Industry 4.0, Industry 5.0, AI, and Digital Transformation are often used interchangeably, which only adds to the confusion. Believe it or not, a review of several hundred published journal papers on the subject turns up no established or definitive definition!

The following covers the main aspects of digital technology, reflecting a consensus of the published literature.

Understanding various technologies is key to credible discussions.

Digital Technology Terms

Cloud computing, for instance, revolutionized data storage, offering flexibility and scalability. Most individuals use cloud-based applications, or even virtual desktop machines, in their daily activities. Providers' ability to deliver services online arguably drove Software as a Service (SaaS), now a routine supply model. Long gone are the days of a server room with the crown jewels (the hosted data) locked inside.

Big data analytics was made possible by the accessibility of information through the cloud and is central to many of the applications and challenges described here. As data availability continues to grow through digitization and socialization, so too does the scope of what can be analyzed.

The Internet of Things (IoT) became a fashionable term for devices connected to the internet. Use cases have exploded in recent years, the classic example being refrigerators that link to supermarket ordering systems. Familiarity, if not complete comfort, with internet-connected devices in the home has fostered greater acceptance of the technology. IoT allows devices to be connected and managed via the cloud – a central heating or domestic lighting system, for instance.
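
As a rough sketch of that idea in Python, the example below shows a device sending a reading to a cloud service so it can be monitored remotely. The endpoint URL and payload format are placeholders invented for illustration, not a real IoT platform's API.

```python
import json
import urllib.request

# Reading from a (hypothetical) smart thermostat
reading = {"device_id": "thermostat-01", "temperature_c": 20.5, "heating_on": True}

# Send it to a cloud endpoint so the device can be monitored and managed remotely.
# The URL below is a placeholder - a real IoT platform would supply its own API.
request = urllib.request.Request(
    "https://example.com/api/devices/thermostat-01/readings",
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(request, timeout=5) as response:
        print("cloud service responded with status", response.status)
except Exception as exc:
    print("could not reach the (placeholder) cloud endpoint:", exc)
```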

Machine learning is concerned with statistical algorithms that infer outcomes from data and typically comprises three elements: a decision process for classification or prediction, an error function that evaluates the model's predictions, and an optimization process that improves the model – an approach widely used in manufacturing systems.
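
To make those three elements concrete, here is a minimal Python sketch (the data and parameters are invented for illustration): a simple linear model is fitted by gradient descent, with the decision process, error function, and optimization step each commented.

```python
import numpy as np

# Toy dataset: a process setting (x) vs. a measured outcome (y) - invented for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 7.9, 10.1])

w, b = 0.0, 0.0          # model parameters
learning_rate = 0.01

for _ in range(2000):
    # 1. Decision process: the model's prediction
    y_pred = w * x + b
    # 2. Error function: mean squared error between prediction and data
    error = np.mean((y_pred - y) ** 2)
    # 3. Optimization: adjust parameters to reduce the error (gradient descent)
    w -= learning_rate * np.mean(2 * (y_pred - y) * x)
    b -= learning_rate * np.mean(2 * (y_pred - y))

print(f"learned model: y ~ {w:.2f}*x + {b:.2f}, final error {error:.3f}")
```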

Digital Twins and Simulation - Predictive analytics describes the process of using data to model scenarios detached from the specific system itself. Digital twins take real data from a system and use it to predict its behaviour under given circumstances. A practical application comes from civil engineering in Norway, where bridge structures are monitored using input data from force-measuring devices; the digital twin models that data to examine the rate of structural decay. This proactive approach minimizes the risk of emergencies and provides a cost-effective solution in the long run.
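
As a rough illustration of the idea (not the actual Norwegian system), the sketch below fits a trend to simulated strain-sensor readings and projects when a hypothetical maintenance threshold would be reached.

```python
import numpy as np

# Simulated monthly readings from a force/strain sensor on a bridge (invented values)
months = np.arange(24)
strain = 100 + 0.8 * months + np.random.normal(0, 1.5, size=24)  # slow upward drift

# Fit a simple linear trend to estimate the rate of structural change
rate, baseline = np.polyfit(months, strain, 1)

# Project when the strain would reach a (hypothetical) maintenance threshold
threshold = 150
months_to_threshold = (threshold - baseline) / rate

print(f"estimated decay rate: {rate:.2f} units/month")
print(f"projected months until maintenance threshold: {months_to_threshold:.0f}")
```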

Blockchain is a shared, distributed ledger that allows transactions to be recorded, verified, and tracked as a chain of linked blocks (the “blockchain”). As an automated process that is secure, transparent, and scalable, it has provided the basis for growth in financial services, most notably cryptocurrency. However, its application is wider than that and lends itself to automated legal contracting, supply chain transactions, or asset transfers.
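
The chaining idea itself is simple enough to sketch. The toy example below links blocks by hashing each one together with the hash of its predecessor; it omits everything a real blockchain network needs (distribution, consensus, mining), but shows why tampering with an earlier record breaks the chain.

```python
import hashlib, json

def make_block(transactions, previous_hash):
    """Bundle transactions with the hash of the previous block, then hash the result."""
    block = {"transactions": transactions, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# Build a tiny chain: each block is linked to its predecessor via its hash
genesis = make_block(["Alice pays Bob 5"], previous_hash="0" * 64)
block_2 = make_block(["Bob pays Carol 2"], previous_hash=genesis["hash"])

# Tampering with an earlier block changes its hash and breaks the link,
# which is what makes the record verifiable and trackable
genesis["transactions"][0] = "Alice pays Bob 500"
recomputed = hashlib.sha256(
    json.dumps({"transactions": genesis["transactions"],
                "previous_hash": genesis["previous_hash"]}, sort_keys=True).encode()
).hexdigest()
print("chain still valid:", recomputed == block_2["previous_hash"])  # False
```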

Artificial intelligence (AI) is possibly the most over-used and misunderstood term in use today, not least because so many different definitions exist. Often spoken of in the singular, AI is in fact a collection of technologies built on algorithms trained on a given (often ‘big’) dataset; programmed algorithms produce machine-learned models, which in turn generate an output. Google Translate is one such learning (large language) model, and virtually any form of (social) media uses learning algorithms to suggest the next piece of content for a user – Netflix and YouTube operate in this manner. All require large datasets and cloud computing to operate. Hence AI is a collection of technologies and not simply one thing, as popular media would have us believe.

There are different types of AI in existence, from the very basic to the stuff of science fiction.

Reactive AI systems look for patterns in data or provide means of classification, which can be useful in medical imaging diagnosis or, more trivially, in identifying music tracks from an audio file (e.g. Shazam).
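
The sketch below illustrates that pattern-matching idea with invented ‘fingerprint’ vectors and a simple nearest-match lookup; a real system such as Shazam derives its fingerprints from the audio itself and matches at far greater scale.

```python
import numpy as np

# A tiny "library" of known tracks, each represented by an invented feature vector
# (a real system would derive these fingerprints from the audio itself)
library = {
    "Track A": np.array([0.9, 0.1, 0.3]),
    "Track B": np.array([0.2, 0.8, 0.5]),
    "Track C": np.array([0.4, 0.4, 0.9]),
}

# Features extracted from the unknown clip (also invented for illustration)
query = np.array([0.25, 0.75, 0.55])

# Reactive pattern matching: pick the stored pattern closest to the query
best_match = min(library, key=lambda name: np.linalg.norm(library[name] - query))
print("closest match:", best_match)   # Track B
```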

Limited memory AI can handle more complex classification activity, often based on unrelated pieces of data. This technology is used to make predictions or take actions, in self-driving cars for example.

Artificial General Intelligence (AGI) or ‘self-aware’ AI is often portrayed by the media as a doomsday scenario, akin to the Terminator films, where computers can surpass human intelligence. In practice, this has not been achieved and remains theoretical.

We will explore some of the ethical considerations later, but there is a lot of concern about where the technology industry is heading, particularly as much of the development of the internet and associated tools has gone largely unregulated.

Other Considerations

Cybersecurity

The escalating threat is evident in rising cybercrime statistics. With advancing technology come automated attacks on systems and domains, a constant concern of chief technology officers. With increasing sophistication in cybercrime come corresponding improvements in security systems. An example is Darktrace, a leading cybersecurity platform that tackles threats in real time through data analytics and machine learning (so-called AI).
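
As a generic illustration of that kind of analytics (not Darktrace's actual method), the sketch below trains an anomaly detector on invented ‘normal’ network traffic and flags a burst that deviates from it.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Invented baseline of normal network activity: [requests/minute, MB transferred]
normal_traffic = rng.normal(loc=[100, 5], scale=[10, 1], size=(500, 2))

# New observations - the last one is a burst that looks like an automated attack
new_traffic = np.array([[105, 5.2], [98, 4.8], [900, 60.0]])

# Train an anomaly detector on normal behaviour, then flag deviations
detector = IsolationForest(random_state=0).fit(normal_traffic)
flags = detector.predict(new_traffic)   # 1 = normal, -1 = anomalous

for reading, flag in zip(new_traffic, flags):
    print(reading, "ANOMALY" if flag == -1 else "ok")
```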

Industry 5.0: The Human Element

Introduced by the European Commission in 2021, Industry 5.0 marks a shift towards human-centric aspects of digital technology, exploring its societal impact. It sparks a crucial debate about the future role of humans in a digitally evolving world.

Ethical Debate and Industry 5.0

As technology rapidly advances, the ethical debate intensifies. Questions about trust, appropriateness, security, and the future of jobs are at the forefront of media and government discussions.

This discourse is partly driven by Industry 5.0, emphasizing the essential interaction between humans and technology for effective implementation. What will the future of work look like? Is the concentration of power in tech companies a good thing? What sort of regulation does all this need?

In conclusion, the dynamic landscape of digital technology necessitates a continuous understanding of its evolving facets, ensuring informed decision-making and adaptation to future advancements.

Summary

This piece is designed to give the reader a high-level overview of digital technology terms and provide some insight into current applications, as well as the potential impact across professional and personal lives. In the next article, we will look at the application of digital technology in pharmaceuticals as a precursor to its use in regulatory affairs.

How we are addressing digital technology at G&L

At G&L Healthcare Advisors, we are committed to staying at the forefront of innovation in the provision of our industry-leading services.

We utilise regular learning sessions to navigate the evolving tech landscape, dispel myths, and foster a better understanding of its impact on regulatory affairs.

We have a dedicated technology group tasked with identifying, testing, and validating the use of digital tools that will help our staff and clients.

We are undertaking trials with technology specifically targeted at regulatory affairs applications. It is important to focus efforts on technology with impact rather than the next shiny new thing!

We will continue to educate our teams and clients alike as we believe that the narrative of technology in our profession should be developed.

Get involved!

We would like to hear your thoughts and perspectives on this area. Are you keen to see regulatory affairs transformed by digital technology? Or are you skeptical of the perceived benefits? How do you believe the industry will change? For better or worse?

For more information visit our website or arrange a call with one of our experts.

Paul Kuiken is Vice President of Advisory Practice at G&L Healthcare Advisors.   
