Deep Learning Prediction Models: Industry-specific Application Examples and Considerations for Adoption
November 26, 2024

Last summer, a research team from the Department of Earth and Environmental Engineering at UNIST announced the successful development of a deep learning prediction model that accurately analyzes typhoon information. This model combines geostationary meteorological satellite data and numerical model data to predict typhoon intensity in real time.

A key feature of the model is that it objectively extracts the environmental factors that drive changes in typhoon intensity, which makes it suitable for operational forecasting systems. It is expected to give weather forecasters more accurate forecast information, which will be of great help in disaster preparedness and damage prevention.

Building a deep learning prediction model can be an important turning point in a company's digital transformation. However, many small and medium-sized enterprises (SMEs) start projects without sufficient preparation, then suffer painful failures during implementation or see low utilization after the system is in place.

In this article, I will introduce real-world examples of deep learning predictive models applied to different industries, as well as a realistic roadmap to help you introduce predictive models with minimal risk.

Practical Applications of Deep Learning Prediction Models

Pharmaceuticals/Bio

With the recent development of artificial intelligence technology, the accuracy of demand forecasting has improved dramatically, and the management strategies of pharmaceutical companies are becoming more sophisticated.

A Blueprint for Defensible AI - IQVIA

The case of IQVIA Solution Japan, a leading healthcare information service company in Japan, shows a successful introduction of AI-based demand forecasting. The company adopted Brainpad's advanced AI sales forecasting model to systematically analyze the vast amount of performance data in the pharmaceutical market.

This has enabled pharmaceutical companies to make more sophisticated decisions. In particular, by providing customized predictive services that reflect the unique business environment and needs of each pharmaceutical company, the service is creating real business value.

Meanwhile, AI-based demand forecasting is also leading innovative changes in the pharmaceutical distribution sector. Usacil has modernized the inventory management system of dispensing pharmacies with AI technology.

This system analyzes sales data from individual pharmacies in depth to identify demand patterns based on regional characteristics and customer composition. It also accurately predicts fluctuations in demand due to seasonal illnesses.

For example, it is now possible to manage inventory efficiently by identifying the optimal stock level of anti-allergy drugs ahead of the spring pollen allergy season.

The introduction of such AI-based demand forecasting systems is having a positive impact across the pharmaceutical industry. There are tangible results in various aspects, including reduced inventory costs, improved distribution efficiency, and increased customer satisfaction.

It is expected that the accuracy of demand forecasting will continue to improve as AI technology advances.

Consumer goods and food

The introduction of AI technology in the food industry is emerging as a key factor in determining a company's competitiveness.

Source: Startup Challenge Forum, Heodak “We will make the AI system available to everyone”

Since 2019, Heodak has built an AI-based demand forecasting system, which has led to innovation in production and logistics. Through the automatic analysis of sales data, production volume and inventory have been optimized, and through collaboration with Mesh Korea, the entire process from order aggregation to ordering and shipping has been automated.

The company aims to raise its overall automation rate from the current 40% to 80-90%, and has already achieved 70% automation in the production process.

As a result, shipments of 3 million products per month can be managed by just two people, and sales increased by more than 200% year-on-year to reach 34 billion won in 2020.

Source: Lotte Confectionery's 'Insider' Snack? 'Elsia' Made It: Naver Blog

Meanwhile, Lotte Confectionery has developed the LCIA trend prediction system through a two-year collaboration with IBM to build a more sophisticated demand forecasting system.

Based on IBM Watson Explorer, the system analyzes product DNA by classifying the taste, ingredients, texture, shape, size, and packaging of products into seven to eight categories. In particular, it adopts a self-learning method similar to AlphaGo, and its prediction accuracy improves over time.

LCIA comprehensively analyzes various data, including social data, POS sales data, weather information, consumption patterns by age, and consumption characteristics by region. This enables it to provide an eight-week demand estimate up to three months in advance and to support the planning of new products that match trends.

This scientific and objective decision-making support system is widely used not only for new product development but also for the establishment of production and sales strategies.

The cases of the two companies show how AI technology can change the future of the food industry. While the case of Heodak focuses on streamlining production and logistics operations, the case of Lotte Confectionery is differentiated by its focus on trend prediction and product development innovation.

Agriculture, Forestry and Fisheries

In the agriculture, forestry, and fisheries industries, the accuracy of demand forecasting and quality control is a key factor in the stability and profitability of the entire industry. Recent advances in AI technology are bringing about innovative changes in this field, and in particular, predictive models using deep learning technology are showing remarkable results.

A typical example is the onion supply and demand forecasting model. This model achieved a high prediction accuracy of 93% by using a Long Short-Term Memory (LSTM) network.

It enabled accurate supply and demand predictions by comprehensively analyzing various variables, including cultivation area, per-unit-area yield forecasts, shipment volume, climate change factors, and farmgate selling prices. In particular, the reliability of the prediction was further enhanced by applying directed acyclic graph (DAG) analysis to precisely identify the relationships between the factors.
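As a concrete illustration, the sketch below shows a minimal LSTM forecaster in Python (Keras). The feature set, window length, and hyperparameters are illustrative assumptions, not the configuration of the cited onion model.

```python
# Minimal LSTM sketch: map a year of weekly feature vectors
# (e.g. cultivation area, unit yield, shipments, climate, farmgate price)
# to a next-period supply or price estimate. Data is a random placeholder.
import numpy as np
import tensorflow as tf

TIMESTEPS, N_FEATURES = 52, 5  # one year of weekly observations (assumed)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(TIMESTEPS, N_FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),  # next-period estimate
])
model.compile(optimizer="adam", loss="mae")

X = np.random.rand(200, TIMESTEPS, N_FEATURES).astype("float32")
y = np.random.rand(200, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```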

Source: A machine learning-based price state prediction model for agricultural commodities using external factors | Decisions in Economics and Finance

Meanwhile, the Agricultural Product Price Forecasting System is achieving remarkable results by applying a reinforcement learning model based on KAMIS agricultural product distribution information.

This system consists of an environment, an agent, and a policy neural network, and provides accurate price forecasts by jointly analyzing climate information, production information, distribution information, consumer intention indices, and other data.
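As an illustration of that environment/agent/policy structure, below is a heavily simplified REINFORCE-style sketch in PyTorch. The toy reward scheme, feature dimensions, and price-state classes are assumptions for demonstration and do not reproduce the KAMIS-based system.

```python
# Minimal policy-gradient (REINFORCE) skeleton: a policy network maps a
# state vector (climate, production, distribution, sentiment features) to
# a distribution over discrete price-state predictions (fall/stable/rise).
import torch
import torch.nn as nn

class PolicyNet(nn.Module):
    def __init__(self, n_features: int = 8, n_actions: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.distributions.Categorical:
        return torch.distributions.Categorical(logits=self.net(state))

def train_step(policy, optimizer, states, realized):
    """One REINFORCE update: reward 1 if the sampled prediction matches
    the realized price state, 0 otherwise."""
    dist = policy(states)
    actions = dist.sample()
    rewards = (actions == realized).float()
    loss = -(dist.log_prob(actions) * rewards).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return rewards.mean().item()

policy = PolicyNet()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
states = torch.randn(32, 8)            # placeholder feature batch
realized = torch.randint(0, 3, (32,))  # placeholder realized price states
print(train_step(policy, optimizer, states, realized))
```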

The use of AI technology is also expanding in the smart farm sector. In the case of tomato smart farms, the ConvLSTM deep learning technique was used to achieve a high accuracy of 0.981 in yield prediction and 0.805 in growth prediction.

By learning hourly patterns based on the NIA's smart farm big data and analyzing weekly and monthly data, it is now possible to accurately predict the growth status and yield of crops.
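A minimal sketch of what such a ConvLSTM predictor could look like in Keras follows. The grid size, sequence length, and layer widths are illustrative assumptions; the cited smart-farm model's actual architecture is not described in this article.

```python
# Minimal ConvLSTM sketch: sequences of 2-D sensor grids (e.g. hourly
# greenhouse environment maps) regressed onto a yield figure.
import numpy as np
import tensorflow as tf

SEQ_LEN, H, W, C = 24, 8, 8, 3  # 24 hourly frames of an 8x8x3 grid (assumed)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN, H, W, C)),
    tf.keras.layers.ConvLSTM2D(16, kernel_size=3, padding="same"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),  # predicted yield for the next period
])
model.compile(optimizer="adam", loss="mse")

X = np.random.rand(64, SEQ_LEN, H, W, C).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(X, y, epochs=1, batch_size=8, verbose=0)
```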

The introduction of this AI technology is contributing to reducing price volatility caused by supply and demand imbalances, increasing production efficiency, and improving the accuracy of quality control.

In particular, in the context of increasing climate change and market uncertainty, AI-based predictive models are becoming key tools for the sustainable development of agriculture, forestry, and fisheries.

Chemicals

AI technology is at the center of the new changes facing the global chemical industry. Global chemical companies are making all-round innovations from R&D to production processes and quality control using AI and deep learning technologies.

Dow Chemical, a US chemical company, is opening up new horizons in the chemical industry through a strategic partnership with 1QBit, a quantum computing company. In particular, it has dramatically shortened the R&D process by developing machine learning-based predictive models in the field of materials science.

This is considered to be an example of maximizing the efficiency of new material development through digital simulation, breaking away from the traditional laboratory-centered research method.

Saudi Arabian global chemical company SABIC has been a pioneer in the adoption of AI technology in polymer modeling. The AI system precisely simulates complex chemical reactions to derive optimal production conditions.

In addition, SABIC is proactively responding to market changes through price and demand forecasting models, and has greatly improved operational efficiency by applying AI technology to supply chain management.

Source: WO2018004304A1 - Method for detecting foreign substances and apparatus and system for the same

The case of LG Chem, a Korean company, shows how AI technology can transform manufacturing sites in the chemical industry. Introducing AI into its petrochemical processes has maximized production efficiency, and its foreign substance detection system has achieved a remarkable accuracy of 99.75%.

Based on these achievements, the company is expanding the scope of application of AI technology to advanced materials and the life sciences.

The success stories of these companies show that AI technology is becoming a core competitive advantage in the chemical industry, rather than a simple auxiliary tool. AI technology is driving innovation in all areas of the chemical industry, including shortening the research and development period, improving production efficiency, and advancing quality control.

In particular, in the current industrial environment where environmental regulations are being tightened and sustainability is being emphasized, AI technology is emerging as a key solution to solve various challenges faced by chemical companies.

This change is redefining the future of the chemical industry. The introduction of AI technology is going beyond a simple digital transformation and is fundamentally changing the paradigm of the chemical industry. It is expected that the chemical industry will continue to evolve into a more intelligent and efficient form with the advancement of AI technology.

Practical Issues in Introducing and Operating Deep Learning Prediction Models

Data Quality Management and Acquisition for Deep Learning Prediction Model Training

Machine Learning Overview (Source: Predicting Employment Through Machine Learning)

As companies accelerate their digital transformation, the adoption of deep learning prediction models is expanding. However, securing and managing high-quality data is the key factor that determines the success or failure of an AI project; in practice, as much as 80% of the model development process is typically spent on data preparation.

Data quality issues directly affect the accuracy of predictive models. Incomplete or biased data can lead to incorrect predictions, which can seriously distort a company's decision-making.

In particular, the complexity of the data collection, refinement, and preprocessing processes is a major challenge faced by many companies.

To solve these problems, a systematic data management process must be established first. This means establishing a data governance system across the organization, rather than simply introducing a technical solution.

In other words, a system is needed that establishes quality control standards throughout the entire process from data collection to storage, processing, and analysis, and continuously monitors them.

Securing the diversity and representativeness of data is also an important task. Since the performance of a predictive model is largely dependent on the quality of the training data, it is essential to secure a variety of unbiased data samples.

To do this, it is necessary to utilize various data sources and properly consider the temporal and spatial scope of the data.

Continuous data quality management is essential for maintaining the long-term performance of predictive models. This refers to a comprehensive quality management system that includes regular data audits, quality metric monitoring, and anomaly detection.

In particular, the establishment of an automated quality management system is required for real-time data.
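As one illustration, such an automated quality gate can start as a small script that screens each incoming batch before it reaches training. The sketch below (pandas) uses hypothetical column names and thresholds.

```python
# Minimal data-quality gate: flag duplicates, excessive missingness, and
# simple 3-sigma outliers in each incoming batch before model training.
import pandas as pd

def quality_report(df: pd.DataFrame, max_missing: float = 0.05) -> dict:
    return {
        "n_rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        # columns whose missing-value share exceeds the threshold
        "columns_over_missing_threshold": [
            col for col, share in df.isna().mean().items()
            if share > max_missing
        ],
        # crude outlier count per numeric column (3-sigma rule)
        "outlier_counts": {
            col: int(((df[col] - df[col].mean()).abs()
                      > 3 * df[col].std()).sum())
            for col in df.select_dtypes("number").columns
        },
    }

# Hypothetical batch with one missing value and one extreme demand spike
batch = pd.DataFrame({"demand": [10, 12, 11, 300, None], "price": [1.0] * 5})
print(quality_report(batch))
```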

In conclusion, a strategic approach to data quality management is required for the successful introduction of deep learning predictive models. This is a task that requires significant initial investment and organizational effort, but it should be recognized that it is an essential investment for securing a company's digital competitiveness in the long term.

Deep Learning Prediction Model Performance and Generalization Issues

The practical application of deep learning prediction models is quite different from development in a laboratory environment. Even if a model performs well in a laboratory, it frequently fails to produce the expected results when applied to a real business environment.

Most of these problems are caused by overfitting. This is a phenomenon in which the model learns the characteristics of the training data in an overly detailed manner, resulting in a significant drop in the predictive performance on new data.

You should also be aware of the problem of data leakage. This refers to the risk of the model's actual performance being overestimated because future information that should not be used for actual prediction is included in the training data during the model development process.

A systematic verification process is required to resolve these issues. The stability of the model should be evaluated through cross-validation, and overfitting should be prevented by applying appropriate regularization techniques.
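For time-series forecasting, scikit-learn's TimeSeriesSplit is one common way to cross-validate without leaking future information: each fold trains only on the past and tests on the future. A minimal sketch on synthetic data:

```python
# Leakage-aware validation: TimeSeriesSplit preserves temporal order, so
# the model never sees "future" rows during training.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import TimeSeriesSplit

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([0.5, -0.2, 0.1, 0.3]) + rng.normal(scale=0.1, size=500)

scores = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
    model = Ridge(alpha=1.0)  # L2 regularization also curbs overfitting
    model.fit(X[train_idx], y[train_idx])
    scores.append(mean_absolute_error(y[test_idx], model.predict(X[test_idx])))

print("MAE per fold:", [round(s, 3) for s in scores])
```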

In particular, it is essential to test various scenarios that take into account the actual business situation. The performance of the model must be verified under various conditions, including market volatility, seasonality, and special situations.

In addition, it is important to establish a system for continuous monitoring and retraining of models. The business environment is constantly changing, and model performance may decline over time. Therefore, model performance should be evaluated periodically and models retrained as needed.
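A minimal sketch of such a monitoring loop: track a rolling error over recent predictions and flag the model for retraining once it drifts past a tolerance above its validation baseline. The window size and threshold here are illustrative assumptions.

```python
# Rolling-error drift monitor with a retraining trigger.
from collections import deque

class DriftMonitor:
    def __init__(self, baseline_mae: float, tolerance: float = 1.3,
                 window: int = 100):
        self.baseline = baseline_mae
        self.tolerance = tolerance
        self.errors = deque(maxlen=window)

    def record(self, y_true: float, y_pred: float) -> bool:
        """Record one prediction; return True if retraining is advised."""
        self.errors.append(abs(y_true - y_pred))
        if len(self.errors) < self.errors.maxlen:
            return False  # not enough recent data to judge
        rolling_mae = sum(self.errors) / len(self.errors)
        return rolling_mae > self.tolerance * self.baseline

monitor = DriftMonitor(baseline_mae=2.0)  # hypothetical validation MAE
for y_true, y_pred in [(10.0, 9.5)] * 150:  # placeholder prediction stream
    if monitor.record(y_true, y_pred):
        print("Rolling MAE exceeded tolerance - trigger retraining")
        break
```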

This is a management task that requires continuous investment and attention from the organization, rather than a simple technical challenge. To overcome this, close cooperation between the business department and the data analysis team is essential.

Technical infrastructure issues

Securing the right infrastructure is a fundamental prerequisite for building advanced predictive systems. In particular, deep learning projects involve large-scale computation and require a significant level of computing power, so there inevitably comes a point where investment decisions on a new scale are needed.

Modern advanced analytics systems demand substantial processing power, and environments that require real-time decision-making demand even more. This is an important consideration for companies when budgeting.

In addition, continuous investment in maintenance and scalability is inevitable after the system is built.

In this situation, the strategic use of cloud services can be a solution. This is because it reduces initial construction costs and allows resources to be flexibly secured as needed. In particular, for projects with large fluctuations in demand, the cloud method can bring significant cost savings.

Meanwhile, securing your own computing equipment is also an important option. Building a GPU or CPU cluster can be more economical in the long run and is also advantageous in terms of security and data sovereignty. However, this path requires up-front capital investment and specialized personnel.

Recently, the hybrid method has been attracting attention. The core analysis is performed on the company's own servers, and the cloud is used only during peak seasons or special situations. This allows for a balance between cost and efficiency.

Ultimately, infrastructure construction should be a strategic decision that takes into account the size, purpose, and budget of the company. Excessive investment can hinder profitability, and insufficient investment can limit the performance of the project. Therefore, a step-by-step and systematic approach is required from a long-term perspective.

Realistic Deep Learning Prediction Model Implementation Roadmap

Step-by-Step Introduction Strategy

Source: Deep Learning Roadmap: A Structured Roadmap for Mastery - GeeksforGeeks

The introduction of a deep learning prediction model is an enterprise-wide innovation project that goes beyond the implementation of a simple technology. A systematic and step-by-step approach is essential for its successful introduction, and clear goals and strategies must be established for each step.

The first step, the preparatory stage, is all about setting a clear vision and goals. There should be a clear purpose for solving specific business problems, not vague expectations or following market trends. You should carefully analyze the expected effects of the investment and establish a data infrastructure plan to achieve them.

Sufficient preparation at this stage greatly increases the chances of future project success.

In the early model development stage, the focus should be on rapid verification rather than perfection. It is desirable to quickly check the feasibility of the project through a simple baseline model and to make gradual improvements based on this.
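For instance, a seasonal naive forecast (repeating the value from one year earlier) is often a sensible first baseline: it takes minutes to build and sets the bar any deep model must beat. A minimal sketch on synthetic weekly demand:

```python
# Seasonal naive baseline: forecast each week as the value 52 weeks earlier.
import numpy as np

def seasonal_naive(series: np.ndarray, season: int = 52) -> np.ndarray:
    """Forecast each point as the value one season earlier."""
    return series[:-season]

rng = np.random.default_rng(1)
weeks = np.arange(208)  # four years of synthetic weekly demand
demand = 100 + 20 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 5, 208)

forecast = seasonal_naive(demand)  # predicts weeks 52..207
actual = demand[52:]
baseline_mae = np.mean(np.abs(actual - forecast))
print(f"Seasonal-naive MAE: {baseline_mae:.2f}")  # the bar to beat
```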

The insights gained in this process will serve as an important foundation for the subsequent model enhancement.

In the model advancement stage, domain knowledge from experts in the field plays a key role. It is necessary to discover meaningful characteristics through close collaboration between the technology team and the field department and effectively reflect them in the model.

In this stage, it is important to focus on securing the stability and reliability of the model rather than rushing to draw conclusions.

The operation and monitoring stage is an important process that determines the sustainability of the project. An automated MLOps system should be established to continuously monitor the performance of the model and make immediate improvements as needed.

This should not be a simple technical task, but should be established as a core operational process of the organization.

Finally, it is necessary to improve AI literacy across the organization and create a culture of data-driven decision-making. This is a task that cannot be accomplished in a short period of time, so it should be accompanied by continuous training and change management.

A phased approach helps to spread out the initial investment costs and control risks to a manageable level. In addition, the experience of success at each stage can boost the organization's confidence and become a driving force for greater innovation.

The important thing is to recognize that all of this is a continuous process, and each step is the foundation for the success of the next step.

Essential prerequisites

Deepflow Application Examples - Productivity Effects by Company

For the successful implementation of a deep learning predictive model, the first priority is to set clear business goals. It should not be simply introduced for the sake of introducing the technology, but should have the purpose of solving specific business problems.

For example, measurable goals should be established, such as “improving demand forecast accuracy by 15%” or “reducing inventory costs by 20%.” This clear goal setting provides direction for the project and serves as a basis for evaluating the return on investment.
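Such a goal can be turned directly into an acceptance test. The sketch below checks a candidate model's MAPE against a 15% relative improvement target; the baseline figure and sample data are hypothetical.

```python
# Acceptance test: does the new model cut MAPE by 15% versus the baseline?
import numpy as np

def mape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

baseline_mape = 20.0           # measured on the current process (assumed)
target = baseline_mape * 0.85  # a 15% relative improvement

y_true = np.array([100.0, 120.0, 90.0, 110.0])  # placeholder actuals
y_pred = np.array([104.0, 115.0, 95.0, 108.0])  # placeholder forecasts
new_mape = mape(y_true, y_pred)
print(f"new MAPE {new_mape:.1f}% vs target {target:.1f}% ->",
      "goal met" if new_mape <= target else "goal not met")
```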

Another key factor that determines the success or failure of a project is the establishment of a data infrastructure. A collection system for securing high-quality data and a quality control process that ensures the accuracy and consistency of data are required.

In particular, small and medium-sized enterprises should focus on the qualitative aspects of data so that they can build effective models with limited data. Along with the establishment of a data governance system, measures to protect data security and privacy must also be considered.

In addition, a precise definition of the business scenario is required. The underlying causes of demand fluctuations must be analyzed and the key variables affecting the forecast must be identified. In this process, the domain knowledge of the business experts is very important.

For example, explanatory factors reflecting business characteristics such as seasonality, promotion effects, and competitor trends must be selected and their interactions must be understood.
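As an illustration, these factors can be encoded as explicit model features. The sketch below uses hypothetical column names for the promotion and competitor-event flags.

```python
# Encoding seasonality, promotion, and competitor signals as features.
import pandas as pd

sales = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=8, freq="W"),
    "units": [120, 135, 128, 190, 142, 138, 210, 150],
    "on_promo": [0, 0, 0, 1, 0, 0, 1, 0],          # promotion flag
    "competitor_launch": [0, 0, 0, 0, 1, 0, 0, 0],  # competitor event flag
})

# Seasonality encoded as calendar features the model can learn from
sales["month"] = sales["date"].dt.month
sales["weekofyear"] = sales["date"].dt.isocalendar().week.astype(int)

# Interaction term: promotions may matter more in peak months
sales["promo_x_month"] = sales["on_promo"] * sales["month"]
print(sales.head())
```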

An objective assessment of technical capabilities and preparation are also important. Internal AI/data analysis capabilities should be assessed, and any deficiencies should be addressed through recruitment or external collaboration.

Rather than internalizing all capabilities, a strategic partnership approach may be more effective. At the same time, training programs should be prepared to improve the AI literacy of existing employees.

For successful implementation, it is important to establish a gradual approach strategy. Rather than a full-scale introduction that requires large-scale investment, it is preferable to verify through a small-scale pilot project and gradually expand.

This approach minimizes risk and is a realistic strategy that takes into account the learning curve of the organization. Insights and experience gained from the initial model become valuable assets in the process of upgrading the model later on.


In the process of preparing to build a deep learning prediction model, there is one principle that applies to all companies, regardless of their size or industry. Rather than focusing solely on securing the quantity of data, it is more important to focus on the quality of the data, and a model design that fully reflects the business context is required.

In addition, the model's prediction results should be oriented towards explainable AI so that the people in charge of the field can understand and trust them, and a system for continuous learning and improvement should be in place. When sufficient time and resources are invested to systematically prepare for these tasks, an effective deep learning prediction model can be built.
