AI-based demand forecasting is becoming core business infrastructure rather than a passing trend. At the center of this transformation, how is ImpactiveAI's DeepFlow leading the market with its differentiated approach? From an AutoML system that automatically combines over 200 prediction models to pipelines that consistently process complex customer data, we explore the present and future of AI prediction technology through an in-depth interview with DeepFlow's Chief Technology Officer.
We had a clear goal from the earliest prototyping of our AutoML system. Each model excels in some forecasting scenarios and struggles in others, so we couldn't know in advance which model would suit the data coming from various companies.
Therefore, we implemented all the methodologies we've researched as models and built a process to automatically experiment with them to find the optimal model for each company or specific product. The core of our AutoML is deriving the best results through this process.
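The selection loop described above can be sketched as a simple backtest: each candidate model forecasts a held-out window of the series, and the one with the lowest error wins. The three toy "models" and the error metric here are illustrative stand-ins, not DeepFlow's actual model zoo.

```python
from statistics import mean

# Toy candidate "models": each maps a history to a one-step forecast.
# These are illustrative stand-ins for a real model library.
CANDIDATES = {
    "naive": lambda hist: hist[-1],
    "mean": lambda hist: mean(hist),
    "drift": lambda hist: hist[-1] + (hist[-1] - hist[0]) / max(len(hist) - 1, 1),
}

def backtest_error(model, series, holdout=3):
    """Mean absolute error over the last `holdout` one-step forecasts."""
    errors = []
    for t in range(len(series) - holdout, len(series)):
        forecast = model(series[:t])
        errors.append(abs(forecast - series[t]))
    return mean(errors)

def select_model(series):
    """Return the candidate name with the lowest backtest error."""
    scores = {name: backtest_error(m, series) for name, m in CANDIDATES.items()}
    return min(scores, key=scores.get)

# A steadily trending series favors the drift model.
series = [10, 12, 14, 16, 18, 20, 22, 24]
print(select_model(series))  # drift
```

The same loop runs per company or per product, so each series can end up with a different winning model.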
Our approach is based on proven open-source models, but we combine and optimize them in unique ways. Rather than simply using existing models as-is, we create hybrid configurations that leverage each model's strengths and develop various models that focus on solving specific tasks (such as products with intermittent patterns). This is our key differentiating factor.
Through this systematic combination, we achieve prediction accuracy that would be difficult to attain with single models. While building on proven technology, we create substantial value through customer-specific optimization - this is the core of our approach.
Ensemble techniques have clear advantages in stability and accuracy compared to single models. While typically 10-20 models are combined, we can find optimal combinations for each client and product by testing more diverse scenarios.
However, this approach requires more sophisticated experimental design and sufficient computing resources. But we believe this is a necessary investment to achieve our goal of providing customers with the highest level of prediction accuracy. Ultimately, we're confident that customer-specific customization can create higher business value.
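As a sketch of how multiple models can be combined, the snippet below averages model forecasts with weights inversely proportional to each model's recent validation error, so better-performing models count more. This inverse-error weighting is a common baseline, not necessarily DeepFlow's scheme, and the numbers are invented.

```python
def ensemble_forecast(forecasts, val_errors):
    """Weighted average of model forecasts; weight = 1 / validation error."""
    weights = {name: 1.0 / max(err, 1e-9) for name, err in val_errors.items()}
    total = sum(weights.values())
    return sum(forecasts[name] * w for name, w in weights.items()) / total

# Three models' forecasts and their recent validation MAEs (illustrative).
forecasts = {"m1": 100.0, "m2": 110.0, "m3": 90.0}
val_errors = {"m1": 2.0, "m2": 4.0, "m3": 8.0}
print(round(ensemble_forecast(forecasts, val_errors), 2))  # 101.43
```

Because the weights come from validation data, the combination adapts per client and per product as new results arrive.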
During prototyping, being new to the AI field, I assumed that feeding lots of data into models and running experiments would always yield good results. I came to conclude that this was wrong.
AI models are ultimately statistics-based, so if the data contains no real patterns, prediction becomes difficult. This applies even to LLMs. My biggest mistake was the blind faith that putting in data would automatically produce results.
This is currently our biggest challenge in building the foundational core. Every time a new customer comes on board, it feels genuinely overwhelming.
B2B and B2C companies are different, and manufacturing companies each have distinct characteristics. For clothing, one product has various options, and some companies change product codes when composition or design changes even slightly. Accommodating these special customer-specific situations within a standardized structure is our most difficult challenge.
Initially, we standardized all customer data to our internal structure. But the structure suited for Company A didn't work for Company B. When we created a new standard to accommodate both A and B, Company A would need to request data changes or migration again.
When Company C was added, we had to migrate A, B, and C all over again, creating a vicious cycle that made us realize this approach wasn't appropriate.
Currently, rather than receiving standardized data, we accept customer data as-is and built a flexible layer that can transform it to fit our internal structure as needed.
Initially, we required specific column name formats, but now we accept customers' columns as-is and handle them ourselves. Internally, we built screens that can monitor data quality and statistical indicators in real-time, allowing us to identify data changes and gain insights needed for experiments before customers do.
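A minimal sketch of such a transformation layer: a per-customer mapping config translates whatever column names arrive into the internal schema, so customers never have to rename anything. The customer names, column names, and internal field names below are all invented for illustration.

```python
# Per-customer mapping from their column names to our internal schema.
# These mappings are hypothetical; real customer schemas vary widely.
CUSTOMER_MAPPINGS = {
    "company_a": {"item_cd": "sku", "sale_qty": "quantity", "sale_dt": "date"},
    "company_b": {"product": "sku", "units": "quantity", "day": "date"},
}

def to_internal(customer, rows):
    """Rename each row's keys to the internal schema, keeping values as-is."""
    mapping = CUSTOMER_MAPPINGS[customer]
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

raw = [{"product": "KNIT-01", "units": 12, "day": "2024-05-01"}]
print(to_internal("company_b", raw))
# [{'sku': 'KNIT-01', 'quantity': 12, 'date': '2024-05-01'}]
```

Onboarding a new customer then means adding one mapping entry rather than migrating every existing customer's data.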
The clothing company case is most memorable. Typically, prediction units and inventory management units (SKUs) are the same, but this company was different.
For knitwear products, they're divided into multiple SKUs by size or color, but predicting each individually left us with too little data to find patterns. Conversely, predicting the entire product group gave meaningful results, but SKU-level distribution was problematic.
We ultimately resolved this by consulting with the customer to distribute prediction values by SKU based on sales volume ratios from recent months. This process required significant changes to our data structure and processing systems.
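The proportional distribution described here can be sketched directly: a product-group forecast is split across SKUs according to each SKU's share of recent sales. The product names and quantities are illustrative.

```python
def disaggregate(group_forecast, recent_sku_sales):
    """Split a product-group forecast across SKUs by recent sales share."""
    total = sum(recent_sku_sales.values())
    return {sku: group_forecast * qty / total
            for sku, qty in recent_sku_sales.items()}

# Forecast 600 units for a knitwear group; split by last 3 months' SKU sales.
recent = {"KNIT-01-S": 100, "KNIT-01-M": 300, "KNIT-01-L": 200}
print(disaggregate(600, recent))
# {'KNIT-01-S': 100.0, 'KNIT-01-M': 300.0, 'KNIT-01-L': 200.0}
```

The group-level model keeps enough data to find patterns, while the ratio step produces the SKU-level numbers the customer operates on.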
We've actually encountered companies with no data management whatsoever. There were cases where the same item carried different attribute values depending on the time period, or where personnel changes left no institutional knowledge behind. These cases are beyond our ability to resolve.
Therefore, we're planning a pre-validation system. We want to create a tool that automatically analyzes and reports whether prediction models can be applied when data is uploaded to a standardized form. Being able to assess data quality before contracts would benefit everyone involved.
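A pre-validation pass like the one planned could start as simply as the checks below: scan an uploaded series for missing values, too-short history, and zero variance before any model runs. The specific thresholds and messages are placeholders, not the planned tool's actual rules.

```python
def validate_series(values, min_length=24):
    """Return a list of data-quality issues that would block forecasting."""
    issues = []
    non_null = [v for v in values if v is not None]
    if len(values) < min_length:
        issues.append(f"history too short: {len(values)} < {min_length} periods")
    if len(non_null) < len(values):
        issues.append(f"missing values: {len(values) - len(non_null)}")
    if non_null and len(set(non_null)) == 1:
        issues.append("series is constant; no pattern to learn")
    return issues

print(validate_series([5, 5, None, 5]))
```

An empty issue list would mean the data clears the bar for a model experiment; a non-empty one becomes the report shared with the prospective customer before any contract.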
Simply providing prediction values alone isn't sufficient to deliver adequate value to customers. Looking at actual field operations, there's a process where prediction values are referenced, but production teams and sales operations teams make final decisions through meetings.
To provide better insights for decision-making processes, we need to explain not just prediction results but also prediction rationale, external factors affecting the domain, and economic indicators. Furthermore, our goal is to evolve beyond simple data viewing into a communication tool for ordering and production decisions.
Observing how the people handling the actual work communicate, most of them use Excel. We're building our tools to reduce that communication cost while providing prediction values, developing them into decision-making tools.
Predictions are inherently bound to be wrong. The direction we want to develop is combining customers' existing work methods with prediction methodologies to create more sophisticated models.
For example, if we predict based on factors A, B, C, and D, we should be able to clearly explain each factor's impact on the results. Even when predictions are wrong, if we can explain which factors acted differently than expected, customers can understand. Simply presenting values leaves nothing to say when wrong, but having logical grounds makes a difference.
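For a linear model, the per-factor explanation described above falls out directly: each factor's contribution is its coefficient times its value, and the contributions plus an intercept sum to the prediction. This is a toy illustration of the idea, not DeepFlow's actual explanation method, and the factor names and coefficients are invented.

```python
def explain(coefficients, factors, intercept=0.0):
    """Per-factor contributions of a linear forecast; they sum to the total."""
    contributions = {name: coefficients[name] * value
                     for name, value in factors.items()}
    total = intercept + sum(contributions.values())
    return total, contributions

# Hypothetical factors: A=price index, B=promo spend, C=season, D=weather.
coefs = {"A": -2.0, "B": 0.5, "C": 30.0, "D": -5.0}
factors = {"A": 10, "B": 40, "C": 1, "D": 2}
total, contrib = explain(coefs, factors, intercept=100.0)
print(total, contrib)
# 120.0 {'A': -20.0, 'B': 20.0, 'C': 30.0, 'D': -10.0}
```

When a forecast misses, the same breakdown shows which factor moved differently than assumed, which is exactly the conversation that "just a number" cannot support.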
If we approach this as AI replacing staff, field practitioners are unlikely to welcome it. That could ultimately result in our system, DeepFlow, being rejected.
Therefore, rather than the concept of AI replacing people, I want to move toward providing help to practitioners and enhancing work efficiency. We intend to reflect their experiences in our solutions through future interviews with field staff.
AI is AI, but when those staff members' experience is involved, we can't ignore that expertise. Rather than the AI just throwing out numbers, it should think alongside the practitioner: 'I've handled this part, so you can finish in one day what used to take a week.' I think that kind of approach might work.
Furthermore, if it becomes a communication tool for determining order quantities, people might think 'there's a tool that can conveniently manage things we used to do haphazardly with just Excel,' and this direction might appeal to them.
Yes, we have. Particularly those who've been in purchasing or ordering for a long time have pride in their experience and know-how. When AI suddenly comes and says it will do what they've been doing, they naturally feel resistance.
That's why we try to approach this from the perspective that AI and human experience combine to create better results. AI doesn't simply present answers but serves as a partner that assists staff thinking processes and increases work efficiency.
As a technology company providing AI prediction solutions, we believe our role is to provide the best insights through accurate prediction models and data analysis. However, like investment or consulting, we view final business decisions as the customer's responsibility.
What we provide is data-based prediction information and evidence-based analysis, and we think business judgments based on this should appropriately combine customers' expertise and situational awareness. Therefore, we focus on improving prediction accuracy and providing better insights, while customers make optimal decisions referencing this - we aim for this collaborative relationship.
When prediction accuracy doesn't reach expected levels, customers may reconsider using the service. We fully understand this. We face this reality squarely and treat overcoming it as an important task we must solve. This is where our strong R&D motivation lies.
Particularly extreme unpredictable situations or rapid market paradigm shifts are areas that any prediction model would find difficult to handle perfectly. While clearly acknowledging these limitations, we focus on continuously improving prediction accuracy in general market conditions. We also think it's important to clearly share the application scope and limitations of prediction models through transparent communication with customers.
We're approaching the transparency and verifiability of prediction results carefully. Rather than touting performance for marketing purposes, we prefer to have customers verify actual performance in real business environments and let that practical recognition spread.
Publicly displaying prediction results can have various risk factors. Particularly for financial products or raw material price predictions, there's potential for investment-related misunderstandings, requiring even more caution. Instead, we're approaching this through case studies in collaboration with customers or anonymized performance indicators to prove our technology's effectiveness.
Actually, inventory management is an area we're keeping some distance from. What we want to do extends to order quantity decisions - how much should be produced.
There are already many inventory management tools available, and ERP systems include inventory management functions as well. Yet inventory management is often said to be the one area that has never been standardized globally. That's why even companies like SAP don't sell inventory management modules in a preset form; each company ultimately has to customize them to use inventory management.
We're a company researching demand forecasting and such areas, but incorporating inventory management functions into our solution feels like putting the cart before the horse. So we thought, let's do what inventory management solutions can't do, and use AI well here to provide higher-value services to customers.
Rather than inventory management, think of it as a daily summary: yesterday's situation was like this, so please check and respond to these items today. That's the kind of service we're considering.
While inventory management solutions focus on organizing past data and understanding current situations, we have strengths in predicting the future and presenting action items accordingly.
For example, B2C companies check and respond to inventory daily. Since we already have customer inventory and sales data, we can make inferences to summarize and notify in advance what inventory responses or orders need to be made quickly.
Yes, we plan to diversify prediction cycles and customize prediction methods to match customer tasks. Currently, customers mainly need monthly predictions, so we primarily provide monthly forecasting. However, customers needing weekly units are increasing, so we're responding to this as well. Weekly and monthly require different data patterns and algorithms, so they must be built as separate model systems.
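One reason weekly and monthly forecasting need separate pipelines is that the same daily data aggregates into very different series. A minimal sketch of that aggregation step, using only Python's standard library, with illustrative data:

```python
from collections import defaultdict
from datetime import date, timedelta

def aggregate(daily, granularity):
    """Sum daily (date, qty) pairs into weekly (ISO week) or monthly buckets."""
    buckets = defaultdict(int)
    for d, qty in daily:
        if granularity == "weekly":
            key = d.isocalendar()[:2]   # (year, ISO week number)
        else:
            key = (d.year, d.month)
        buckets[key] += qty
    return dict(buckets)

# Fourteen days of 10 units each, starting Monday 2024-01-01.
start = date(2024, 1, 1)
daily = [(start + timedelta(days=i), 10) for i in range(14)]
print(aggregate(daily, "weekly"))   # two ISO weeks of 70 each
print(aggregate(daily, "monthly"))  # one January bucket of 140
```

Everything downstream of this step, from seasonality handling to the choice of algorithm, differs by granularity, which is why the two are built as separate model systems.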
For clothing companies and B2C enterprises, there's strong demand for monitoring inventory situations daily and making quick decisions. We're also developing features that automatically generate insights and action items for such daily operational optimization using our inventory and sales data. Our goal is to evolve beyond simple prediction value provision into real-time decision support tools.
We basically have various daily prediction models. Daily units have high data volatility, requiring more sophisticated approaches for reliable pattern identification. However, depending on specific industry characteristics or business situations, we believe we can provide sufficiently meaningful insights.
What we focus on is the practical utilization of prediction results rather than the prediction cycle itself. For example, rather than daily sales volume prediction, automatically identifying and presenting high-priority response tasks by comprehensively analyzing daily business situations can provide more direct value to customers.
We're thinking about features where customers can directly define specific factors. For example, what happens if marketing budget increases by 20%, what happens if new competitors enter - setting such assumptions and inferring how prediction values would change.
With such features, rather than receiving just one prediction value, customers could develop response strategies for various scenarios. Currently we only provide one result we consider best, but we're considering features that allow viewing from various perspectives.
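A what-if feature like that could start as a simple adjustment layer on top of a baseline forecast: each scenario changes one or more drivers, and an assumed sensitivity translates that into a shifted prediction. The drivers and elasticity values below are invented for illustration; they are not DeepFlow's planned mechanism.

```python
def scenario_forecast(baseline, adjustments, elasticities):
    """Shift a baseline forecast by driver changes times assumed sensitivities.

    adjustments: fractional change per driver, e.g. {"marketing": 0.20}
    elasticities: assumed fractional demand change per unit driver change
    """
    multiplier = 1.0
    for driver, change in adjustments.items():
        multiplier *= 1.0 + elasticities.get(driver, 0.0) * change
    return baseline * multiplier

# Hypothetical sensitivities: marketing helps, new competitors hurt.
elasticities = {"marketing": 0.3, "new_competitors": -0.05}
base = 1000.0
print(round(scenario_forecast(base, {"marketing": 0.20}, elasticities), 2))
print(round(scenario_forecast(base, {"new_competitors": 1.0}, elasticities), 2))
```

Running the same baseline through several such scenarios gives the customer a range of outcomes to plan against, rather than the single best-guess value provided today.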
Responding to team members' various questions and technical issues makes days pass quickly. Especially with many junior personnel, providing technical direction and helping solve problems takes up a considerable portion.
As a technical leader, I'm always thinking about making the best decisions. Establishing technology strategies for the company's long-term growth while efficiently solving immediate challenges is what I'd call my main work.
The most important thing is building systematic data management systems. We too realized the importance of this through various trials and errors.
Particularly, deep understanding of customers' business domains must come first. Since technical excellence alone has limitations in creating actual business value, I think accumulating domain expertise through customer-specific customized approaches is key to success. Sometimes the process of deeply understanding individual customer needs can be more valuable than standardized solutions.
Maintaining prudence in technology choices and architecture decisions. If initial decisions are wrong, you might face situations requiring complete system reconstruction later, so decisions considering both long-term scalability and stability are necessary.
Also, creating an environment where team members can grow on stable technical foundations is important. I think finding the balance point between providing technical challenges and learning opportunities while securing execution power for business goal achievement is key.
AI technology is rapidly developing, but there are still many areas requiring verification. Particularly with generative AI, there are cases where plausible but inaccurate information is provided, so we always go through verification processes when applying to actual work.
We comprehensively evaluate both implementation possibility and reliability when introducing new technologies. While technological innovation is important, we prioritize whether it can be stably applied to actual customer services. Through this careful approach, we aim to provide customers with trustworthy solutions by applying only verified technologies to services.
Actually, I think now is the time to focus on solidly building the company's technical foundation. While solving current technical challenges step by step alongside long-term vision, I'm continuously working to become a better technical leader.
As AI-based demand forecasting technology rapidly develops, DeepFlow's approach is drawing attention for combining technical excellence with customers' actual business needs and a deep philosophy of AI-human collaboration. From AutoML technology combining over 200 prediction models to pipelines that flexibly process complex customer data, and a view of AI as a collaborative partner rather than a human replacement, DeepFlow's journey points to a direction the AI prediction industry can take. In particular, the effort to deliver substantial value through continuous innovation while honestly acknowledging the technology's limits is a lesson many AI companies should take to heart.