Achieving Sales Forecasting with Higher Accuracy and Efficiency: A New Model Based on Modified Transformer
Journal of Theoretical and Applied Electronic Commerce Research (IF 5.318) Pub Date: 2023-11-02, DOI: 10.3390/jtaer18040100
Qianying Li, Mingyang Yu

With the exponential expansion of e-commerce, an immense volume of historical sales data has been generated and amassed. This influx of data has created an opportunity for more accurate sales forecasting. While various sales forecasting methods and models have been applied in practice, existing ones often struggle to fully harness sales data and manage significant fluctuations. As a result, they frequently fail to make accurate predictions, falling short of meeting enterprise needs. It is therefore imperative to explore new models that enhance the accuracy and efficiency of sales forecasting. In this paper, we introduce a model tailored for sales forecasting based on a Transformer with an encoder–decoder architecture and multi-head attention mechanisms. We have made specific modifications to the standard Transformer model, such as removing the Softmax layer in the last layer and adapting the input embedding, position encoding, and feedforward network components to align with the unique characteristics of sales forecast data and the specific requirements of sales forecasting. The multi-head attention mechanism in our proposed model can directly compute the dot-product results in a single step, addressing long-term time-dependent computation challenges while maintaining lower time complexity and greater interpretability. This enhancement significantly contributes to improving the model's accuracy and efficiency. Furthermore, we provide a comprehensive formula representation of the model for the first time, facilitating better understanding and implementation. We conducted experiments using sales datasets that incorporate various factors influencing sales forecasts, such as seasons, holidays, and promotions. The results demonstrate that our proposed model significantly outperforms seven selected benchmark methods, reducing RMSLE, RMSWLE, NWRMSLE, and RMALE by approximately 48.2%, 48.5%, 45.2%, and 63.0%, respectively.
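The single-step dot-product computation described above is the core of scaled dot-product multi-head attention: all pairwise query–key products for a sequence are obtained in one batched matrix multiplication, rather than by recurrent steps. The following is a minimal NumPy sketch of that mechanism under generic assumptions; the dimensions, weight initialization, and function name are illustrative, not the paper's actual configuration.

```python
import numpy as np

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """x: (seq_len, d_model); Wq/Wk/Wv/Wo: (d_model, d_model) projections."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project once, then split the model dimension into heads.
    q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # All pairwise dot products in a single batched matmul: (heads, seq, seq)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    out = weights @ v                                  # (heads, seq, d_head)
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

rng = np.random.default_rng(0)
d_model, seq_len, heads = 8, 5, 2
Ws = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
y = multi_head_attention(rng.normal(size=(seq_len, d_model)), *Ws, heads)
print(y.shape)  # (5, 8)
```

Because every time step attends to every other in one matmul, long-range dependencies cost no extra sequential passes, which is the efficiency property the abstract highlights.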
Additionally, ablation experiments on the multi-head attention and the number of encoder–decoders validate the rationality of our chosen model parameters.
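For reference, RMSLE (root mean squared logarithmic error) is the standard metric among those reported; a hedged sketch follows. The weighted variants (RMSWLE, NWRMSLE) add per-item weights whose definitions are not given in this abstract, so only the unweighted form is shown.

```python
import numpy as np

def rmsle(y_true, y_pred):
    """Root mean squared logarithmic error for non-negative sales values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # log1p handles zero-sales entries without -inf.
    return float(np.sqrt(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2)))

print(rmsle([10, 20, 30], [10, 20, 30]))  # 0.0 for a perfect forecast
```

Because the error is taken on log-transformed values, the metric penalizes relative rather than absolute deviations, which suits sales series spanning several orders of magnitude.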
