A Deep Learning-based Prefetching Approach to Enable Scalability for Data-intensive Applications

Buyuktanir T., Aktaş M. S.

2022 IEEE International Conference on Big Data, Big Data 2022, Osaka, Japan, 17 - 20 December 2022, pp.2716-2721

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/bigdata55660.2022.10020591
  • City: Osaka
  • Country: Japan
  • Page Numbers: pp.2716-2721
  • Keywords: bi-lstm, lstm, mobile prefetching, sequential prediction, web prefetching
  • Yıldız Technical University Affiliated: Yes


Fast data delivery to users is a major challenge in client-server-based data-intensive systems. Prefetching is a widely used technique for increasing the operational performance and data-delivery speed of data-intensive mobile and web applications. This study proposes a methodology that addresses this challenge using deep-learning-enabled prefetching. The proposed methodology minimizes data-access latency by modeling users' navigational browsing data and predicting their subsequent actions. It models the clickstream data with two recurrent neural network methods: LSTM and bi-directional LSTM (Bi-LSTM). To demonstrate the usability of the proposed methodology, we provide a prototype implementation using a public dataset obtained from the log files of a coffee-store mobile application. Because a small number of users accounted for the majority of the load on the system, the prototype segments users into active and cold users. To evaluate the prototype implementation, we conducted an experimental study investigating whether the system can predict users' near-future requests, and we recorded the resulting cache hit rates. The results show that the proposed prefetching approach is promising and can be utilized in client-server-based data-intensive applications.
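The pipeline the abstract describes can be sketched as: learn next-action likelihoods from clickstream sessions, prefetch the predicted items, and score the cache hit rate. As a minimal, hedged illustration, the sketch below substitutes a simple first-order (bigram) frequency predictor for the paper's LSTM/Bi-LSTM models; the session data, function names, and the stand-in predictor are all hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of a prefetching pipeline. A bigram frequency model
# stands in for the LSTM / Bi-LSTM clickstream models used in the paper.
from collections import Counter, defaultdict

def train_next_click(sessions):
    """Count action -> next-action transitions across user sessions."""
    transitions = defaultdict(Counter)
    for session in sessions:
        for prev, nxt in zip(session, session[1:]):
            transitions[prev][nxt] += 1
    return transitions

def predict_next(transitions, action, k=1):
    """Return the k most likely next actions (the prefetch candidates)."""
    return [a for a, _ in transitions[action].most_common(k)]

def cache_hit_rate(transitions, sessions, k=1):
    """Fraction of requests whose item was prefetched after the previous click."""
    hits = total = 0
    for session in sessions:
        for prev, nxt in zip(session, session[1:]):
            total += 1
            if nxt in predict_next(transitions, prev, k):
                hits += 1
    return hits / total if total else 0.0

# Toy clickstream: pages visited in order within each session (invented data).
train = [["home", "menu", "coffee", "checkout"],
         ["home", "menu", "tea", "checkout"],
         ["home", "menu", "coffee", "checkout"]]
model = train_next_click(train)
print(predict_next(model, "menu"))        # ["coffee"] — the majority next click
print(cache_hit_rate(model, train, k=1))
```

In the paper's setting, `predict_next` would instead invoke a trained LSTM or Bi-LSTM over the recent action sequence, but the surrounding prefetch-and-score logic is the same.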