Multi-Aspect Temporal Topic Evolution with Neural-Symbolic Fusion and Information Extraction for Yelp Review Analysis: A Comprehensive Deep Learning Framework

Irfan Ali

Abstract

The exponential growth of user-generated content on review platforms like Yelp presents unprecedented opportunities for understanding consumer behaviour and market dynamics through advanced natural language processing. However, existing approaches face critical limitations: traditional topic models fail to capture fine-grained aspect-specific insights, neural methods lack integrated information extraction capabilities, and temporal dynamics modelling remains underdeveloped. Extracting actionable intelligence from unstructured review text is computationally challenging due to inherent linguistic complexity, temporal variability, multi-dimensional sentiment patterns, and the need to understand geographic market variations. These challenges necessitate a comprehensive framework that simultaneously addresses aspect extraction, topic discovery, temporal evolution, and market analysis. We propose the Multi-Aspect Temporal Topic Evolution with Neural-Symbolic Fusion and Information Extraction (MATTE-NSF-IE) framework, a novel end-to-end system for analysing restaurant reviews. The framework integrates four synergistic components: (1) a transformer-based information extraction module leveraging RoBERTa, VADER, and BERT for aspect extraction, sentiment classification, and named entity recognition; (2) a neural-symbolic topic modelling architecture combining Latent Dirichlet Allocation with TF-IDF weighting for aspect-aware topic discovery; (3) a temporal forecasting system using ensemble moving-average prediction for sentiment trend analysis; and (4) a geographic market analysis module with statistical validation through Mann-Whitney U tests. We evaluated MATTE-NSF-IE on the Yelp Open Dataset, analysing 3,000 high-quality restaurant reviews spanning 2005–2018 from 1,467 businesses across 248 metropolitan areas. The information extraction module achieved a 70.0% F1-score for aspect extraction, 70.8% for sentiment classification, and 97.2% for named entity recognition. Topic modelling generated eight coherent aspect-specific topics with an 87.5% diversity score and 0.208 NPMI coherence. Temporal analysis achieved a mean absolute error of 17.9% in sentiment forecasting. Market analysis revealed statistically significant geographic patterns (p < 0.05) across 10 major cities, identifying variations in health trends (3.57–4.38), service priorities (0.72–0.78), and price sensitivity (0.44–0.57). The framework enables real-time business intelligence applications, personalised recommendation systems, and comprehensive market analysis. Our approach provides actionable insights for restaurant management, investment decisions, consumer behaviour analysis, and location-based market intelligence, positioning it for high-impact deployment in both academic research and industry applications.
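The abstract names largely off-the-shelf building blocks for each of its four components. The sketch below is a minimal illustration of how such a pipeline could be wired together in Python, assuming the vaderSentiment, spaCy, scikit-learn, NumPy, and SciPy packages; spaCy's small English model stands in here for the BERT-based NER stage, and the sample reviews, sentiment thresholds, and window sizes are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the four MATTE-NSF-IE stages described in the
# abstract. This is NOT the authors' code: the wiring, sample data, and
# parameter choices are assumptions made for demonstration only.
import numpy as np
import spacy  # pip install spacy; python -m spacy download en_core_web_sm
from scipy.stats import mannwhitneyu
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import TfidfVectorizer
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

reviews = [
    "The pasta was superb but the waiter ignored us for twenty minutes.",
    "Great value lunch specials, a clean dining room, and friendly staff.",
    "Overpriced burgers in downtown Phoenix; the fries arrived cold.",
]

# (1) Information extraction: VADER sentiment scores plus named entities.
#     spaCy's small English model stands in for the BERT-based NER stage.
sia = SentimentIntensityAnalyzer()
nlp = spacy.load("en_core_web_sm")
for text in reviews:
    compound = sia.polarity_scores(text)["compound"]  # in [-1, 1]
    label = ("positive" if compound >= 0.05
             else "negative" if compound <= -0.05 else "neutral")
    entities = [(ent.text, ent.label_) for ent in nlp(text).ents]
    print(label, entities)

# (2) Aspect-aware topic discovery: LDA fitted on TF-IDF features, the
#     combination the abstract describes (sklearn's LDA is more commonly
#     fitted on raw term counts).
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(reviews)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {top_terms}")

# (3) Temporal forecasting: an ensemble of moving-average predictors,
#     i.e. average the trailing means computed at several window sizes.
monthly_sentiment = np.array([0.61, 0.58, 0.64, 0.60, 0.55, 0.59, 0.62])
forecast = np.mean([monthly_sentiment[-w:].mean() for w in (2, 3, 5)])
print(f"next-period sentiment forecast: {forecast:.3f}")

# (4) Geographic market comparison: Mann-Whitney U test between two
#     cities' per-review sentiment scores (hypothetical samples).
city_a = np.array([0.72, 0.66, 0.80, 0.58, 0.74])
city_b = np.array([0.41, 0.55, 0.48, 0.52, 0.44])
u_stat, p_value = mannwhitneyu(city_a, city_b, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```

The moving-average "ensemble" here simply averages trailing means taken at several window sizes; the abstract does not specify the actual ensemble weighting, nor how the RoBERTa/BERT outputs are fused with the symbolic (LDA + TF-IDF) side of the framework.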

Article Details

How to Cite
[1] I. Ali, “Multi-Aspect Temporal Topic Evolution with Neural-Symbolic Fusion and Information Extraction for Yelp Review Analysis: A Comprehensive Deep Learning Framework,” IJAINN, vol. 5, no. 6, pp. 10–19, Oct. 2025, doi: 10.54105/ijainn.F1106.05061025.
Section
Articles

