
LAMEE: a light all-MLP framework for time series prediction empowering recommendations


Abstract

Exogenous variables, which originate outside the recommendation system itself, can significantly enhance its performance. Integrating these time-evolving exogenous variables into a time series and performing time series prediction can therefore maximize the potential of recommendation systems. We refer to this task as Time Series Prediction Empowering Recommendations (TSPER). However, as a subtask within the recommendation system, TSPER faces unique challenges, such as computational and data constraints, system evolution, and the need for both performance and interpretability. To meet these needs, we propose a lightweight Multi-Layer Perceptron (MLP) architecture with joint time-frequency information, named Light All-MLP with joint TimE-frEquency information (LAMEE). LAMEE uses a lightweight all-MLP architecture to achieve computational efficiency and adaptive online learning. Moreover, several strategies are employed to stabilize its performance and make the model interpretable. Across multiple time series datasets potentially related to recommendation systems, LAMEE balances performance, efficiency, and interpretability, overall surpassing existing, more complex methods.
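To make the described architecture concrete, below is a minimal, hypothetical sketch of an all-MLP forecaster that combines a time-domain view and a frequency-domain view of an input window, in the spirit of the abstract. The paper's exact layer sizes, loss, and fusion scheme are not given on this page; all names here (TimeFreqMLP, lookback, horizon, hidden) and all dimensions are illustrative assumptions, not LAMEE's actual design.

```python
# Hypothetical sketch of an all-MLP forecaster with joint time-frequency
# features, in the spirit of the abstract. Not the paper's actual model.
import torch
import torch.nn as nn


class TimeFreqMLP(nn.Module):
    def __init__(self, lookback: int, horizon: int, hidden: int = 128):
        super().__init__()
        n_freq = lookback // 2 + 1  # single-sided spectrum size (see Note 1)
        # Time-domain branch: a plain MLP over the raw input window.
        self.time_mlp = nn.Sequential(
            nn.Linear(lookback, hidden), nn.GELU(), nn.Linear(hidden, hidden)
        )
        # Frequency-domain branch: an MLP over real-FFT magnitudes.
        self.freq_mlp = nn.Sequential(
            nn.Linear(n_freq, hidden), nn.GELU(), nn.Linear(hidden, hidden)
        )
        # Fuse both views and project to the forecast horizon.
        self.head = nn.Linear(2 * hidden, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback) window of a univariate exogenous series.
        amp = torch.fft.rfft(x, dim=-1).abs()  # (batch, lookback // 2 + 1)
        h = torch.cat([self.time_mlp(x), self.freq_mlp(amp)], dim=-1)
        return self.head(h)  # (batch, horizon)


# Usage: forecast the next 24 steps from a 96-step window.
model = TimeFreqMLP(lookback=96, horizon=24)
y_hat = model(torch.randn(32, 96))
print(y_hat.shape)  # torch.Size([32, 24])
```

In this sketch the frequency branch sees the magnitudes of the single-sided spectrum, so the two branches observe complementary views of the same window at MLP-only cost, which is the kind of time-frequency pairing the abstract describes.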



Data Availability

No datasets were generated or analysed during the current study.

Notes

  1. Here, we use the single-sided spectrum for simplicity, so there are \(\lfloor \frac{J+O}{2}\rfloor +1\) frequency components rather than \(J+O\); see the sketch after these notes.

  2. https://gis.cdc.gov/grasp/fluview/fluportaldashboard.html

  3. https://www.csmar.com/channels/31.html

  4. http://pems.dot.ca.gov/

  5. https://github.com/zhouhaoyi/ETDataset
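Note 1 can be checked numerically: the single-sided (real) FFT of a real-valued signal of length \(J+O\) yields \(\lfloor \frac{J+O}{2}\rfloor +1\) frequency components. The values J = 96 and O = 24 below are illustrative, not taken from the paper.

```python
# Verify the single-sided spectrum size stated in Note 1.
import numpy as np

J, O = 96, 24                                   # illustrative lookback/horizon sizes
spectrum = np.fft.rfft(np.random.randn(J + O))  # single-sided spectrum of a real signal
assert spectrum.shape[0] == (J + O) // 2 + 1    # 61 components, not J + O = 120
print(spectrum.shape[0])
```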


Funding

This work is funded in part by the National Natural Science Foundation of China under Grants No. U1936213 and No. 62322601, by the Shanghai Science and Technology Development Fund under Grant No. 19DZ1200802, and by the Excellent Youth Foundation of Chongqing under Grant No. CSTB2023NSCQJQX0025.

Author information

Authors and Affiliations

Authors

Contributions

Yi Xie proposed the main idea, completed the main experiments, and wrote the main manuscript text. Yun Xiong, Xiaofeng Gao, and Chao Chen provided suggestions and supervision for the general experimental settings and participated in polishing this manuscript. Xian Wu assisted in completing the experiments. All authors reviewed the manuscript.

Corresponding author

Correspondence to Xiaofeng Gao.

Ethics declarations

Ethical approval

This study did not involve any experiments on human or animal subjects. Therefore, ethical approval was not required.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article belongs to the Topical Collection: Special Issue on Advancing recommendation systems with foundation models

Guest Editors: Kai Zheng, Renhe Jiang, and Ryosuke Shibasaki.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Xie, Y., Xiong, Y., Gao, X. et al. LAMEE: a light all-MLP framework for time series prediction empowering recommendations. World Wide Web 27, 13 (2024). https://doi.org/10.1007/s11280-024-01251-w

