The fast-paced age of computer connectivity is blurring the lines among the physical, digital, and biological spheres. Tech is everything and tech is everywhere. Most organizations have gone through some form of digital transformation journey. Today, almost every company is a technology company.
As automation technology matures and becomes an integral part of business operations, it’s time for organizations to pivot from process automation to intelligent automation, and from data-driven organizations to AI-powered organizations.
However, many organizations are still struggling with digital transformation to become data-driven. So, how should they approach this new challenge?
Computing technology
Several computing technologies work together:
- IoT (Internet of Things) is the source of big data.
- Cloud computing facilitates the storage and processing of large data sets.
- AI (Artificial Intelligence) enables advanced analytics.
- ML (Machine Learning) learns and identifies data patterns and makes predictions to perform operations without human intervention.
- Cognitive computing mimics the function of the human brain to help improve human decision making.
Cognitive computing is the next-generation information system that understands, reasons, learns, and interacts with the business ecosystem. It’s continually learning from past experience, building knowledge, understanding natural language, and reasoning and interacting more naturally with human beings than traditional programmable systems.
Cognitive computing is the third era of computing. We went from the first era with computers that tabulate sums (1900s) to the second era with programmable computer systems (1950s).
Data and analytics
Advancements in technology, especially in data and analytics, enable a range of unforeseen opportunities to amplify, automate, and optimize business operations and decision making.
As organizations have embarked on digital transformation journeys over the last few decades, data and analytics have become widespread, well understood, and successfully used in many organizations. This makes them a good starting point for an AI transformation journey.
Progressing from data and analytics to AI is a natural and pragmatic path. Winning with data, analytics, and AI requires a holistic approach: a data-driven, analytics-enabled, and AI-powered technology strategy.
Artificial intelligence
Artificial Intelligence (AI) seems to be the buzzword of the moment, presenting both distracting hype and powerful opportunities to leap the business forward. Today, AI remains elusive and misunderstood, yet it has captured the imagination of many.
What exactly is AI, how can we get there, and what are the opportunities, the challenges, and the benefits, in practical terms?
The exhibit below is the typical AI technology roadmap with its branches and approaches.
While data analysis is the process of turning raw data into clear, meaningful, and actionable insights, Artificial intelligence (AI) is a data science field that uses advanced algorithms to allow computers to learn on their own from experience, adjust to new inputs and perform human-like tasks. It seeks to mimic human abilities and simulate human intelligence in a machine.
Businesses produce massive amounts of data that are impossible for humans to keep up with. However, if we can analyze data by leveraging the power of artificial intelligence, then we can produce results far beyond what humans are capable of doing, in terms of speed, reliability, and accuracy. In other words, AI makes big data appear small. It automates and simplifies many human tasks.
AI is a broad field of study that includes many theories, methods, and technologies. Its major subsets nest from broadest to most specific: Artificial Intelligence > Machine Learning > Deep Learning.
Machine learning
Machine learning is a subset of AI that trains a machine how to learn. It’s a data analysis method that automates the building of an analytical model and makes necessary adjustments to adapt to new scenarios independently. It uses methods from neural networks, statistics, and operations research to uncover hidden insights in data and develop pattern recognition capability that continuously learns from and makes predictions based on data. It continuously makes adjustments without being programmed and makes decisions with minimal human intervention.
In general, there are four methods of machine learning:
- Supervised learning works with labeled data sets; because the correct answers are provided, it requires less training effort.
- Unsupervised learning classifies unlabeled data sets by identifying patterns and relationships.
- Semi-supervised learning uses a small labeled data set as a guide to classify a larger unlabeled data set.
- Reinforcement learning works by interacting with the environment, aiming to maximize rewards through trial-and-error actions.
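The difference between the first two methods can be sketched in a few lines of code. Below is a minimal, illustrative contrast: a supervised 1-nearest-neighbor classifier (labels provided) versus a tiny unsupervised two-means clustering (labels discovered). The data values are made up for illustration.

```python
def nearest_neighbor_classify(train, query):
    """Supervised: labels are provided, so prediction is a lookup
    of the closest labeled example (1-nearest-neighbor)."""
    x, label = min(train, key=lambda pair: abs(pair[0] - query))
    return label

def two_means_cluster(points, iters=10):
    """Unsupervised: no labels; split points into two groups by
    iteratively refining two cluster centers (a tiny k-means, k=2)."""
    c1, c2 = min(points), max(points)
    for _ in range(iters):
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1 = sum(g1) / len(g1)
        c2 = sum(g2) / len(g2)
    return sorted(g1), sorted(g2)

# Supervised: labeled temperatures -> "cold" / "hot"
train = [(2, "cold"), (5, "cold"), (30, "hot"), (35, "hot")]
print(nearest_neighbor_classify(train, 4))    # -> cold
print(nearest_neighbor_classify(train, 33))   # -> hot

# Unsupervised: the same numbers with no labels -> two discovered groups
print(two_means_cluster([2, 5, 30, 35]))      # -> ([2, 5], [30, 35])
```

Real systems use far richer models, but the core distinction is the same: supervised learning copies known answers, unsupervised learning finds structure on its own.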
Machine learning algorithms are categorized by the type of problem to be solved and the type of output to be generated. They fall into three categories:
- Classification is a supervised machine learning algorithm. The classification algorithm helps to sort and classify our data into different pre-determined buckets.
- Clustering is an unsupervised machine learning algorithm. It’s used to group data points having similar attributes or characteristics into a cluster.
- Regression is a supervised machine learning algorithm. It uses existing or past trends to predict an unknown numeric value.
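Regression is the easiest of the three to show concretely. Below is a minimal sketch that fits a line y = a*x + b to past data points with ordinary least squares and then predicts an unknown value; the numbers are made up for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for a single feature: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Past trend: e.g. advertising spend (x) vs. sales (y)
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]            # exactly y = 2x + 1, kept simple for clarity

a, b = fit_line(xs, ys)
print(a, b)                  # -> 2.0 1.0
print(a * 5 + b)             # predict the unknown value at x = 5 -> 11.0
```

Classification and clustering follow the same pattern: learn parameters from past data, then apply them to new inputs.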
Deep learning
Deep learning is a subset of machine learning that outperforms traditional machine learning approaches on many tasks. It uses multi-layer artificial neural networks and large amounts of data to train a computer to perform human-like tasks.
Deep learning models take advantage of advances in computing power and improved training techniques to learn complex patterns in large amounts of data, typically in unsupervised or semi-supervised settings. Some models are so effective that they have begun to surpass human abilities in many areas, such as voice and speech recognition, pattern or image recognition, and natural language processing.
Neural networks
As explained above, machine learning is a subset of artificial intelligence, and deep learning is a subset of machine learning. The “deep” in deep learning refers to the depth of layers in a neural network. To be precise, a neural network consisting of more than three layers, including the input and output layers, can be considered a deep learning algorithm.
Neural networks mimic the human brain through a set of algorithms. They are made up of interconnected units, like neurons, that process information by responding to external inputs and relaying information between units. This process requires multiple passes over the data to find connections and derive meaning from undefined data.
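The layered structure described above can be sketched in a few lines: each unit sums its weighted inputs, adds a bias, and applies an activation function before relaying the result to the next layer. The weights below are hand-picked purely for illustration; in a real network they are learned from data.

```python
import math

def relu(x):
    """Common hidden-layer activation: pass positives, zero out negatives."""
    return max(0.0, x)

def layer(inputs, weights, biases, activation):
    """One layer: each unit computes activation(weighted sum + bias)."""
    return [activation(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x):
    """Forward pass: input layer -> one hidden layer -> output layer."""
    hidden = layer(x, [[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0], relu)
    out = layer(hidden, [[1.0, 1.0]], [0.0],
                lambda z: 1 / (1 + math.exp(-z)))  # sigmoid output
    return out[0]

print(forward([1.0, 0.0]))   # a single probability-like output between 0 and 1
```

Stacking many such layers, with millions of learned weights, is what makes a network "deep".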
There are many technologies that enable and support the development of AI. Below are several of them.
Natural language processing
Natural language processing (NLP) is a branch of artificial intelligence that helps computers analyze, understand, interpret, and manipulate human language in the form of text and voice. NLP helps computers communicate with humans in their own language by making it possible for computers to read text, hear speech, interpret meaning, and also measure sentiment. NLP is entering the next level of development with natural language interaction, which will enable humans to communicate with computers in everyday language to perform human tasks.
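One of the NLP tasks mentioned above, measuring sentiment, can be sketched with a simple bag-of-words approach. The tiny word lists below are illustrative assumptions; real systems use learned models and vocabularies of many thousands of words.

```python
# Hypothetical, hand-picked sentiment word lists (illustration only).
POSITIVE = {"great", "good", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def tokenize(text):
    """Split raw text into lowercase word tokens, stripping punctuation."""
    return [w.strip(".,!?").lower() for w in text.split()]

def sentiment(text):
    """Score a text: +1 for each positive word, -1 for each negative word."""
    return sum((t in POSITIVE) - (t in NEGATIVE) for t in tokenize(text))

print(sentiment("I love this excellent product!"))   # -> 2
print(sentiment("Poor quality, I hate it."))         # -> -2
```

Reading text, interpreting meaning, and conversing naturally all build on richer versions of the same idea: turning language into representations a computer can compute with.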
Computer vision
Computer vision is a field of artificial intelligence that trains computers to interpret and understand the visual world. It relies on pattern recognition and deep learning to recognize the contents in a picture or video. With the ability to accurately identify, classify, process, analyze and understand the objects, it can capture images or videos in real-time to interpret their surroundings and take appropriate actions accordingly. Today, computer vision rivals and surpasses human visual abilities in many areas.
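The pattern recognition mentioned above starts from very simple primitives. Below is a minimal sketch of one: detecting a vertical edge in a tiny grayscale "image" with a hand-written convolution filter. Deep networks learn thousands of such filters automatically; this one is hand-picked for illustration.

```python
IMAGE = [  # 4x4 pixel grid: dark left half (0), bright right half (9)
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

KERNEL = [[-1, 1]]  # responds strongly to left-to-right brightness jumps

def convolve(image, kernel):
    """Slide the kernel over the image and record its response at each spot."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(len(image) - kh + 1):
        row = []
        for c in range(len(image[0]) - kw + 1):
            row.append(sum(kernel[i][j] * image[r + i][c + j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

edges = convolve(IMAGE, KERNEL)
print(edges[0])   # -> [0, 9, 0]: the strong response marks the vertical edge
```

Identifying, classifying, and understanding whole objects is done by stacking many learned filters like this one inside a deep network.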
The Internet of Things (IoT)
The exponential rise of the Internet of Things (IoT), with connected devices in every corner of business operations, has generated massive amounts of data, but most of it is unanalyzed and wasted. This opens a new frontier for organizations to have an IoT-enabled Enterprise Management System or Enterprise Digital Platform that mines and unlocks the value of data by leveraging AI technology. Such a digital platform can carry real-time data collected from distributed devices in the field or on the shop floor up to the C-suite for operational and strategic decision-making.
Graphical Processing Unit (GPU)
Graphical processing units are a key enabler of AI development because they provide the heavy computing power needed for real-time iterative processing. Neural network processing requires both big data and substantial computing power.
Application Programming Interfaces (API)
Application programming interfaces are portable packages of code that make it possible to add AI functionality to existing products and software packages. This is the open and modular approach in the modern software development environment.
AI transformation strategy
Just like digital transformation or any transformation project, the AI transformation is also less about the technology and more about the people and the strategy.
Human beings are incredible creatures with so many unique capabilities that no machine can replicate: empathy, enthusiasm, imagination, passion, creativity, flexibility, and inventiveness. Therefore, it’s critical to take a human-centered approach to AI transformation. The right AI transformation approach is for technology to adapt to people and strategy, not the other way around.
To achieve comprehensive and successful AI transformation, organizations must democratize AI by implementing no-code or low-code tools and platforms in order to bring the power of AI to the desktop of every employee. With access to AI as part of their everyday routine tasks, people in any function and position can get more things done and do things that were not possible before. They can find critical information, uncover hidden insights, automate repetitive tasks, improve collaboration, etc.
A successful AI transformation strategy must consider cultural issues as well as business issues. This requires a fundamental transformation in how things are done, how employees relate to each other, what are the skillset and mindset needed, what are the processes and guiding principles, etc.
It’s the people that make the difference. Data scientists and developers working in isolation often deliver models that lack business knowledge, purpose, or value. Similarly, business people working in isolation lack the technical knowledge to understand what can be done from an AI and data science perspective. However, by enabling cross-functional teams and making those who know the business a central part of the AI transformation process, organizations can create powerful and effective AI solutions.
Key takeaways
Artificial intelligence (AI) has made advancements that were unimaginable even just a few years ago. This cutting-edge technology has transformed from vision to reality, creating tangible benefits for people and organizations.
AI is clearly the defining and essential technology of our time. While the complexity of AI may seem daunting or intimidating, all we need is a high-level understanding of AI capabilities to capitalize on the opportunities and capture the value. Just like smartphones, we can operate them to make our work and life easier and more productive without needing to understand the technical details inside the device.
The goal of AI is not to replace humans anytime soon. It’s to provide an intelligent system that can reason on the input and explain the output. AI will provide human-like interactions and offer decision support for specific tasks.
As with all technological innovations, the adoption of AI technology will have broad positive and negative impacts on society, raising complex and challenging questions about the future we want to live in.
Last but not least ...
IMDA (Infocomm Media Development Authority) of Singapore believes that AI, Data, and Blockchain technologies will play an important part in ensuring the success of Cloud Native Architecture. These are promising technologies that have the potential to catalyze businesses' digital transformation.
I strongly recommend that you read my e-book on the next four digital technology revolutions powered by Blockchain or Distributed Ledger Technology (DLT).