Leveraging Large Language Model ChatGPT for enhanced understanding of end-user emotions in social media feedbacks EI Scopus
Journal Article | 2025, 261 | Expert Systems with Applications

Abstract :

For software evolution, user feedback has become a meaningful way to improve applications. Recent studies show a significant increase in analyzing end-user feedback from various social media platforms for software evolution. However, less attention has been given to end-user feedback on low-rated software applications. Moreover, such approaches rely mainly on the judgment of human annotators, who may have subconsciously second-guessed themselves, calling the validity of the methods into question. For this purpose, we propose an approach that analyzes end-user feedback for low-rated applications to identify the opinion types associated with negative reviews (an issue or bug). We also utilize Generative Artificial Intelligence (AI), i.e., ChatGPT, as an annotator and negotiator when preparing a truth set for deep learning (DL) classifiers to identify end-user emotion. We first used the ChatGPT Application Programming Interface (API) to identify negative end-user feedback by processing 71,853 reviews collected from 45 apps in the Amazon store. Next, a grounded theory was developed by manually processing negative end-user feedback to identify frequently associated emotion types, including anger, confusion, disgust, distrust, disappointment, fear, frustration, and sadness. Two datasets were then developed: one annotated by humans using a content analysis approach and the other by the ChatGPT API with the identified emotion types. Another round was then conducted with ChatGPT to negotiate the conflicts with the human-annotated dataset, resulting in a conflict-free emotion detection dataset. Finally, various DL classifiers, including LSTM, BiLSTM, CNN, RNN, GRU, BiGRU, and BiRNN, were employed to evaluate their efficacy in detecting end-user emotions by preprocessing the input data, applying feature engineering, balancing the dataset, and then training and testing them with cross-validation.
We obtained average accuracies of 94%, 94%, 93%, 92%, 91%, 91%, and 85% with LSTM, BiLSTM, RNN, CNN, GRU, BiGRU, and BiRNN, respectively, showing improved results with the truth set curated by humans and ChatGPT. Using ChatGPT as an annotator and negotiator can help automate and validate the annotation process, resulting in better DL performance. © 2024 Elsevier Ltd
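The human/LLM annotation-merge step this abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: `negotiate` is a hypothetical placeholder for the follow-up ChatGPT round and here simply defaults to the human label.

```python
# Merge human and LLM emotion labels: agreements are accepted directly,
# conflicts are queued for a second "negotiation" round.
EMOTIONS = {"anger", "confusion", "disgust", "distrust",
            "disappointment", "fear", "frustration", "sadness"}

def negotiate(review, human_label, llm_label):
    # Placeholder for another ChatGPT API round; defaults to the human label.
    return human_label

def build_truth_set(reviews, human_labels, llm_labels):
    truth, conflicts = {}, []
    for rid, text in reviews.items():
        h, m = human_labels[rid], llm_labels[rid]
        assert h in EMOTIONS and m in EMOTIONS
        if h == m:
            truth[rid] = h          # agreement: accept directly
        else:
            conflicts.append(rid)   # disagreement: resolve by negotiation
            truth[rid] = negotiate(text, h, m)
    return truth, conflicts

reviews = {1: "app keeps crashing", 2: "totally misleading ad"}
truth, conflicts = build_truth_set(
    reviews,
    human_labels={1: "frustration", 2: "distrust"},
    llm_labels={1: "frustration", 2: "anger"},
)
```

The conflict list makes the disputed items auditable before the merged truth set is handed to the DL classifiers.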

Keyword :

Input-output programs; Application programs; Emotion recognition; Software testing; Requirements engineering; Deep learning

Cite:


GB/T 7714 Khan, Nek Dil, Khan, Javed Ali, Li, Jianqiang, et al. Leveraging Large Language Model ChatGPT for enhanced understanding of end-user emotions in social media feedbacks [J]. Expert Systems with Applications, 2025, 261.
MLA Khan, Nek Dil, et al. "Leveraging Large Language Model ChatGPT for enhanced understanding of end-user emotions in social media feedbacks." Expert Systems with Applications 261 (2025).
APA Khan, Nek Dil, Khan, Javed Ali, Li, Jianqiang, Ullah, Tahir, Zhao, Qing. Leveraging Large Language Model ChatGPT for enhanced understanding of end-user emotions in social media feedbacks. Expert Systems with Applications, 2025, 261.
Monthly Streamflow Prediction of the Source Region of the Yellow River Based on Long Short-Term Memory Considering Different Lagged Months EI SCIE Scopus
Journal Article | 2024, 16 (4) | WATER

Abstract :

Accurate and reliable monthly streamflow prediction plays a crucial role in the scientific allocation and efficient utilization of water resources. In this paper, we propose a prediction framework that integrates input variable selection methods and Long Short-Term Memory (LSTM). The input selection methods, including the autocorrelation function (ACF), partial autocorrelation function (PACF), and time-lagged cross-correlation (TLCC), were used to analyze the lag times between variables. The performance of the LSTM model was then compared with three other traditional methods. The framework was used to predict monthly streamflow at the Jimai, Maqu, and Tangnaihai stations in the source area of the Yellow River. The results indicated that grid search and cross-validation can improve the efficiency of determining model parameters. Models incorporating ACF, PACF, and TLCC lag times are clearly superior to models that use only the current variables as inputs. Furthermore, the LSTM model that considers lag times demonstrated better performance in predicting monthly streamflow: the coefficient of determination (R²) improved by an average of 17.46%, 33.94%, and 15.29% at the three stations, respectively. The integrated framework shows promise in enhancing the accuracy of monthly streamflow prediction, thereby aiding strategic decision-making for water resources management.
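The ACF-based lag selection this abstract describes can be sketched as follows: compute the sample autocorrelation at each candidate lag and keep the lags whose magnitude clears a threshold as model inputs. A minimal sketch on a synthetic 12-month cycle, not the authors' code; the threshold value is an assumption.

```python
import math

def acf(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def select_lags(series, max_lag=12, threshold=0.2):
    """Keep lags whose |ACF| clears the threshold as candidate model inputs."""
    return [k for k in range(1, max_lag + 1) if abs(acf(series, k)) >= threshold]

# Synthetic monthly series with a 12-month cycle: lags near 3 and 9 months
# are nearly uncorrelated, while 6 and 12 months are strongly correlated.
series = [math.sin(2 * math.pi * t / 12) for t in range(240)]
lags = select_lags(series)
```

The same pattern extends to PACF or TLCC scores; only the scoring function changes.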

Keyword :

monthly streamflow prediction; data-driven models; Yellow River; LSTM; lagged time analysis

Cite:


GB/T 7714 Chu, Haibo, Wang, Zhuoqi, Nie, Chong. Monthly Streamflow Prediction of the Source Region of the Yellow River Based on Long Short-Term Memory Considering Different Lagged Months [J]. WATER, 2024, 16 (4).
MLA Chu, Haibo, et al. "Monthly Streamflow Prediction of the Source Region of the Yellow River Based on Long Short-Term Memory Considering Different Lagged Months." WATER 16.4 (2024).
APA Chu, Haibo, Wang, Zhuoqi, Nie, Chong. Monthly Streamflow Prediction of the Source Region of the Yellow River Based on Long Short-Term Memory Considering Different Lagged Months. WATER, 2024, 16 (4).
Ensemble Deep Learning Techniques for Advancing Breast Cancer Detection and Diagnosis EI Scopus
Conference Paper | 2024, 1134, 181-192 | 13th International Conference on Frontier Computing, FC 2023

Abstract :

The integration of deep learning (DL) and digital breast tomosynthesis (DBT) presents a unique opportunity to improve the reliability of breast cancer (BC) detection and diagnosis while accommodating novel imaging techniques. This study uses the publicly available Mammographic Image Analysis Society (MIAS) database v1.21 to evaluate DL algorithms in identifying and categorizing cancerous tissue. The dataset has undergone preprocessing and has been confirmed to be of high quality. Transfer learning is employed with four pre-trained models (MobileNet, Xception, DenseNet, and MobileNet-LSTM) to improve performance on the target task. Stacking ensemble learning is then used to combine the predictions of the best-performing models into a final prediction for the presence of BC. Each model is evaluated using standard metrics, including accuracy (ACC), precision (PREC), recall (REC), and F1-score (F1-S). This study highlights the potential of DL in enhancing diagnostic imaging and advancing healthcare. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2024.

Keyword :

Learning systems; Mammography; Tomography; Learning algorithms; Diseases; Long short-term memory

Cite:


GB/T 7714 Ibrahim, Adam M., Hassan, Ayia A., Li, Jianqiang, et al. Ensemble Deep Learning Techniques for Advancing Breast Cancer Detection and Diagnosis [C]. 2024: 181-192.
MLA Ibrahim, Adam M., et al. "Ensemble Deep Learning Techniques for Advancing Breast Cancer Detection and Diagnosis." (2024): 181-192.
APA Ibrahim, Adam M., Hassan, Ayia A., Li, Jianqiang, Pei, Yan. Ensemble Deep Learning Techniques for Advancing Breast Cancer Detection and Diagnosis. (2024): 181-192.
Energy optimized data fusion approach for scalable wireless sensor network using deep learning-based scheme EI SCIE Scopus
Journal Article | 2024, 224 | JOURNAL OF NETWORK AND COMPUTER APPLICATIONS

Abstract :

Energy efficiency and security are critical components of Quality of Service (QoS) and remain a challenge in WSN-assisted IoT owing to its open and resource-limited nature. Despite intensive research on WSN-IoT, only a few studies have achieved significant levels of energy efficiency and load balancing when clustering nodes. This study proposes a novel approach for dynamic cluster-based WSN-IoT networks that enhances the network's resilience using data fusion techniques and eliminates illogical clustering. The Mean Value and Minimum Distance Method identifies the optimal cluster heads within the network by reducing data redundancy, resulting in improved quality of service, energy optimization, and enhanced lifetime. The proposed fused deep learning-based data mining method (RNN-LSTM) mitigates overfitting and enhances dynamic routing and load balancing at the WSN fusion center. The approach splits the network into layers and assigns sensor nodes to each layer, drastically reducing latency, data transfers, and the fusion center's overhead. Distinct experiments evaluated the approach's efficacy by varying the hidden-layer nodes and signaling intervals. The empirical results show that the presented routing algorithms surpass state-of-the-art conventional routing systems in energy depletion, average latency, signaling overhead, cumulative throughput, and route heterogeneity.
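The cluster-head selection idea named above (Mean Value and Minimum Distance) can be sketched as: compute the mean position of the cluster's nodes, then elect the node closest to that mean as cluster head. A minimal geometric sketch under that reading of the method; the node coordinates are hypothetical, and real implementations would also weigh residual energy.

```python
import math

def cluster_head(nodes):
    """Elect the node nearest the cluster's mean (centroid) position."""
    cx = sum(x for x, _ in nodes) / len(nodes)
    cy = sum(y for _, y in nodes) / len(nodes)
    return min(range(len(nodes)),
               key=lambda i: math.dist(nodes[i], (cx, cy)))

# Four sensor nodes at hypothetical (x, y) positions.
nodes = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0), (1.0, 0.9)]
head = cluster_head(nodes)
```

Electing a central node shortens the average intra-cluster hop, which is what saves transmission energy.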

Keyword :

RNN-LSTM; Data fusion; Inclusive innovation; Internet of Things; Multi-hop clustering; Energy balanced

Cite:


GB/T 7714 Mahmood, Tariq, Li, Jianqiang, Saba, Tanzila, et al. Energy optimized data fusion approach for scalable wireless sensor network using deep learning-based scheme [J]. JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2024, 224.
MLA Mahmood, Tariq, et al. "Energy optimized data fusion approach for scalable wireless sensor network using deep learning-based scheme." JOURNAL OF NETWORK AND COMPUTER APPLICATIONS 224 (2024).
APA Mahmood, Tariq, Li, Jianqiang, Saba, Tanzila, Rehman, Amjad, Ali, Saqib. Energy optimized data fusion approach for scalable wireless sensor network using deep learning-based scheme. JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2024, 224.
Enhanced forecasting of online car-hailing demand using an improved empirical mode decomposition with long short-term memory neural network EI SCIE SSCI Scopus
Journal Article | 2024 | TRANSPORTATION LETTERS - THE INTERNATIONAL JOURNAL OF TRANSPORTATION RESEARCH

Abstract :

The study of forecasting demand for online car-hailing holds substantial implications for both online car-hailing platforms and the government agencies responsible for traffic management. This research proposes an enhanced Empirical Mode Decomposition Long Short-Term Memory (EMD-LSTM) model. The EMD technique reduces noise and extracts stable intrinsic mode functions (IMFs) from the original time series. A genetic algorithm is deployed to improve K-Means clustering for determining the optimal clusters. The resulting sub-series serve as input for the prediction model, and the combined results give the final predictions. Experimental data from Didi include Haikou's car-hailing orders from May to October 2017 and Beijing's from January to May 2020. Results show that the improved EMD-LSTM reduces instability and better captures series characteristics. Compared to the unmodified EMD-LSTM, RMSE decreases by 3.50%, 6.81%, and 6.81% for the three datasets, and by 30.97%, 20%, and 9.24%, respectively, compared to a single LSTM model.
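The decompose-predict-recombine pattern behind EMD-LSTM can be sketched as follows. This is a minimal stand-in, not the paper's method: a moving-average "trend" plus residual replaces the EMD/IMF decomposition, and trivial per-component predictors (linear extrapolation, persistence) replace the LSTMs.

```python
def moving_average(x, w=3):
    """Trailing moving average with a shrinking window at the start."""
    return [sum(x[max(0, i - w + 1):i + 1]) / len(x[max(0, i - w + 1):i + 1])
            for i in range(len(x))]

def decompose(x, w=3):
    """Stand-in for EMD: split the series into a smooth component and a residual."""
    trend = moving_average(x, w)
    resid = [a - b for a, b in zip(x, trend)]
    return trend, resid

def forecast_next(x):
    """Predict each component separately, then sum the component forecasts."""
    trend, resid = decompose(x)
    trend_next = 2 * trend[-1] - trend[-2]   # linear extrapolation of the trend
    resid_next = resid[-1]                   # persistence for the residual
    return trend_next + resid_next

value = forecast_next([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
```

In the paper's pipeline, each clustered sub-series gets its own LSTM and the per-component forecasts are summed in the same way.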

Keyword :

EMD; online car-hailing demand forecasting; LSTM; improved K-Means

Cite:


GB/T 7714 Liu, Jiaming, Tang, Xiaoya, Liu, Haibin. Enhanced forecasting of online car-hailing demand using an improved empirical mode decomposition with long short-term memory neural network [J]. TRANSPORTATION LETTERS - THE INTERNATIONAL JOURNAL OF TRANSPORTATION RESEARCH, 2024.
MLA Liu, Jiaming, et al. "Enhanced forecasting of online car-hailing demand using an improved empirical mode decomposition with long short-term memory neural network." TRANSPORTATION LETTERS - THE INTERNATIONAL JOURNAL OF TRANSPORTATION RESEARCH (2024).
APA Liu, Jiaming, Tang, Xiaoya, Liu, Haibin. Enhanced forecasting of online car-hailing demand using an improved empirical mode decomposition with long short-term memory neural network. TRANSPORTATION LETTERS - THE INTERNATIONAL JOURNAL OF TRANSPORTATION RESEARCH, 2024.
CO2 Emission Prediction Method in MSWI Process Based on LSTM Compensated With ARIMA EI Scopus
Conference Paper | 2024, 2357-2362 | 36th Chinese Control and Decision Conference, CCDC 2024

Abstract :

Municipal solid waste incineration (MSWI) processes emit the greenhouse gas carbon dioxide (CO2), contributing to global atmospheric warming. To achieve the dual-carbon goal and protect the ecological environment, it is imperative to predict CO2 emission concentrations and implement proactive control measures. Addressing these concerns, this study introduces a CO2 emission prediction model for the MSWI process based on an ARIMA model compensated by LSTM. Initially, the ARIMA model serves as the primary predictor of CO2 emissions and its prediction residuals are calculated. Subsequently, the LSTM model functions as a compensatory model, using the prediction residuals as training targets to construct its predictions. Finally, the predicted values from the primary and compensatory models are weighted and combined to yield the final result. Experiments conducted on data from an MSWI plant in Beijing demonstrate the efficacy of this approach. © 2024 IEEE.
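The primary-plus-compensator structure described above can be sketched as follows. A minimal stand-in, not the paper's models: linear extrapolation plays the role of ARIMA, a recent-residual mean plays the role of the LSTM compensator, and the weight `w` is a hypothetical tuning parameter.

```python
def primary_predict(history):
    """Stand-in for the ARIMA primary model: linear extrapolation."""
    return 2 * history[-1] - history[-2]

def compensator_predict(residuals, window=3):
    """Stand-in for the LSTM compensator: mean of recent residuals."""
    recent = residuals[-window:]
    return sum(recent) / len(recent)

def combined_predict(history, residuals, w=1.0):
    """Final forecast = primary forecast + weighted residual compensation."""
    return primary_predict(history) + w * compensator_predict(residuals)

# Hypothetical CO2 concentration history and past prediction residuals.
pred = combined_predict([1.0, 2.0, 3.0, 4.0], [0.5, 0.1, 0.3])
```

The point of the design is that the compensator only has to model what the primary predictor systematically misses.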

Keyword :

Waste incineration; Carbon dioxide; Long short-term memory; Municipal solid waste; Forecasting; Greenhouse gases

Cite:


GB/T 7714 Wang, Zi, Tang, Jian, Xia, Heng, et al. CO2 Emission Prediction Method in MSWI Process Based on LSTM Compensated With ARIMA [C]. 2024: 2357-2362.
MLA Wang, Zi, et al. "CO2 Emission Prediction Method in MSWI Process Based on LSTM Compensated With ARIMA." (2024): 2357-2362.
APA Wang, Zi, Tang, Jian, Xia, Heng, Zhang, Runyu, Wang, Tianzheng, Wu, Zhiwei. CO2 Emission Prediction Method in MSWI Process Based on LSTM Compensated With ARIMA. (2024): 2357-2362.
Runoff projection in the Tibetan Plateau using a long short-term memory network-based framework under various climate scenarios EI SCIE Scopus
Journal Article | 2024, 632 | JOURNAL OF HYDROLOGY

Abstract :

Runoff projections for the Tibetan Plateau (TP) can provide a valuable basis for decisions on water resource management and facilitate the assessment of potential risks to water security. This study estimated TP runoff at a grid resolution of 0.5 degrees for the near-term period (2022-2051) and the long-term period (2052-2082) under various climate scenarios. A framework based on long short-term memory (LSTM) was used to project the impact of climate change on runoff by integrating input selection, LSTM modelling, and runoff projection. The framework was used to determine the variables that contribute to runoff in different grid cells, analyze the relationship between these variables and runoff, and predict how runoff may change in the future due to precipitation, temperature, and antecedent runoff under various climate scenarios. The projections showed that runoff would be approximately 450 mm/a from 2022 to 2051 and approximately 465 mm/a from 2052 to 2082, an increase of about 46% compared to the period from 1982 to 2012. The projected increase diminishes from the southwest to the northeast across the TP. These findings will assist in developing proactive adaptation strategies and support long-term economic growth through effective water resource management plans.

Keyword :

Runoff projection; Tibetan Plateau; Long short-term memory network; Grid scale; Climate scenarios

Cite:


GB/T 7714 Chu, Haibo, Wei, Jiahua, Wang, Hao, et al. Runoff projection in the Tibetan Plateau using a long short-term memory network-based framework under various climate scenarios [J]. JOURNAL OF HYDROLOGY, 2024, 632.
MLA Chu, Haibo, et al. "Runoff projection in the Tibetan Plateau using a long short-term memory network-based framework under various climate scenarios." JOURNAL OF HYDROLOGY 632 (2024).
APA Chu, Haibo, Wei, Jiahua, Wang, Hao, Zhou, Jinjun. Runoff projection in the Tibetan Plateau using a long short-term memory network-based framework under various climate scenarios. JOURNAL OF HYDROLOGY, 2024, 632.
Dynamic System Modeling Using a Multisource Transfer Learning-Based Modular Neural Network for Industrial Application EI SCIE Scopus
Journal Article | 2024, 20 (5), 7173-7182 | IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
WoS CC Cited Count: 1

Abstract :

Establishing an accurate model of dynamic systems poses a challenge for complex industrial processes. Owing to their ability to handle complex tasks, modular neural networks (MNNs) have been widely applied to industrial process modeling. However, domain drift caused by changing operating conditions may lead to a cold start of the model, which degrades MNN performance. For this reason, a multisource transfer learning-based MNN (MSTL-MNN) is proposed in this study. First, knowledge-driven transfer learning is performed with domain similarity evaluation, knowledge extraction, and fusion, forming an initial subnetwork in the target domain. The positive transfer of effective knowledge thus avoids the cold-start problem of the MNN. Second, during the data-driven fine-tuning process, a regularized self-organizing long short-term memory algorithm is designed to fine-tune the structure and parameters of the initial subnetwork, improving the prediction performance of the MNN. Relevant theoretical analysis is given to ensure the feasibility of MSTL-MNN. Finally, the effectiveness of the proposed method is confirmed by two benchmark simulations and a real industrial dataset from a municipal solid waste incineration process. Experimental results demonstrate the merits of MSTL-MNN for industrial applications.
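The domain-similarity-evaluation step named above can be sketched as follows: summarize each domain by a feature statistic vector, score source domains by cosine similarity to the target, and take the top-k as knowledge sources. A minimal sketch under that assumption; the statistic vectors and domain names are hypothetical, not from the paper.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank_sources(target_stats, source_stats, k=2):
    """Rank source domains by similarity to the target; the top-k would
    seed the initial subnetwork in the target domain."""
    order = sorted(source_stats,
                   key=lambda name: cosine(target_stats, source_stats[name]),
                   reverse=True)
    return order[:k]

target = [1.0, 0.0, 1.0]
sources = {"A": [0.9, 0.1, 1.1], "B": [0.0, 1.0, 0.0], "C": [1.0, 1.0, 1.0]}
best = rank_sources(target, sources)
```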

Keyword :

Computational modeling; Dynamic system; Task analysis; Multisource transfer learning; Mathematical models; Modular neural network (MNN); Multi-layer neural network; Prediction algorithms; Dynamical systems; Neurons; Long short-term memory (LSTM)

Cite:


GB/T 7714 Duan, Haoshan, Meng, Xi, Tang, Jian, et al. Dynamic System Modeling Using a Multisource Transfer Learning-Based Modular Neural Network for Industrial Application [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (5): 7173-7182.
MLA Duan, Haoshan, et al. "Dynamic System Modeling Using a Multisource Transfer Learning-Based Modular Neural Network for Industrial Application." IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS 20.5 (2024): 7173-7182.
APA Duan, Haoshan, Meng, Xi, Tang, Jian, Qiao, Junfei. Dynamic System Modeling Using a Multisource Transfer Learning-Based Modular Neural Network for Industrial Application. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (5), 7173-7182.
A Trusted Authentication Scheme Using Semantic LSTM and Blockchain in IoT Access Control System EI SCIE Scopus
Journal Article | 2024, 20 (1) | INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS
WoS CC Cited Count: 1

Abstract :

In edge computing scenarios, the wide distribution of devices, complex application environments, and limited computing and storage capabilities make authentication and access control inefficient. To address these issues, a secure trusted authentication scheme based on semantic Long Short-Term Memory (LSTM) and blockchain is proposed for IoT applications. The attribute-based access control model is optimized by combining blockchain technology with access control models, effectively improving the robustness and credibility of the access control system. Semantic LSTM is used to predict environmental attributes that further restrict user access and dynamically meet minimum-permission granting requirements. Experiments show that with 60 certificates, the computational overhead of the proposed method is only 203 s, lower than that of other state-of-the-art methods. The performance of the proposed scheme in protecting information security in IoT environments therefore shows promise as a scalable authentication solution for IoT applications.
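The attribute-based access control (ABAC) decision with an environmental restriction, as described above, can be sketched as follows. A minimal illustration, not the paper's scheme: the policy shape and the `time_of_day` attribute (standing in for an LSTM-predicted environmental attribute) are hypothetical, and the blockchain layer is out of scope here.

```python
def authorize(user, resource, env, policy):
    """Grant access only if user, resource, and environment attributes all
    satisfy the policy's allowed values."""
    for scope, attrs in (("user", user), ("resource", resource), ("env", env)):
        for key, allowed in policy.get(scope, {}).items():
            if attrs.get(key) not in allowed:
                return False
    return True

policy = {
    "user": {"role": {"engineer", "admin"}},
    "resource": {"type": {"sensor"}},
    "env": {"time_of_day": {"working_hours"}},  # predicted environmental attribute
}
ok = authorize({"role": "engineer"}, {"type": "sensor"},
               {"time_of_day": "working_hours"}, policy)
denied = authorize({"role": "engineer"}, {"type": "sensor"},
                   {"time_of_day": "night"}, policy)
```

The environmental clause is what lets the same user be granted or denied dynamically, matching the minimum-permission idea in the abstract.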

Keyword :

Attribute-based access control; Long Short-Term Memory; Blockchain; Edge computing; Internet of Things

Cite:


GB/T 7714 Zhao, Ge, Li, Xiangrong, Li, Hao. A Trusted Authentication Scheme Using Semantic LSTM and Blockchain in IoT Access Control System [J]. INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS, 2024, 20 (1).
MLA Zhao, Ge, et al. "A Trusted Authentication Scheme Using Semantic LSTM and Blockchain in IoT Access Control System." INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS 20.1 (2024).
APA Zhao, Ge, Li, Xiangrong, Li, Hao. A Trusted Authentication Scheme Using Semantic LSTM and Blockchain in IoT Access Control System. INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS, 2024, 20 (1).
Joint Task Offloading and Content Caching for NOMA-Aided Cloud-Edge-Terminal Cooperation Networks EI SCIE Scopus
Journal Article | 2024, 23 (10), 15586-15600 | IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS
WoS CC Cited Count: 4

Abstract :

To satisfy the requirements of content distribution in computation-intensive and delay-sensitive services, this paper presents a novel joint task offloading and content caching (JTOCC) scheme for multi-cell multi-carrier non-orthogonal multiple-access (MCMC-NOMA)-assisted cloud-edge-terminal cooperation networks. Based on queuing theory, we formulate a delay minimization model that aggregates users' requests to reduce repeated content delivery. To minimize network latency, the model is decomposed into three subproblems: task offloading; user clustering and communication resource allocation; and cache state updating. In each slot, the task offloading subproblem is solved using deep reinforcement learning (DRL) under a resource-constrained cloud-edge-terminal setting. During the transition between slots, mobile terminals are grouped using K-means-based user clustering, and the allocation of subchannels and transmit power is optimized using matching theory and successive convex approximation (SCA), respectively. Contents cached at the network nodes are updated according to long short-term memory (LSTM)-predicted popularity. Simulations show that the proposed JTOCC scheme achieves lower-delay content distribution than existing counterparts in cloud-edge-terminal cooperation environments and converges quickly in heterogeneous networks.
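The popularity-driven cache update at the end of the pipeline above can be sketched as follows: retain, at each slot transition, the contents with the highest predicted popularity. A minimal sketch, not the paper's algorithm; the popularity scores (which an LSTM would produce) and content names are hypothetical.

```python
def update_cache(predicted_popularity, cache_size):
    """Keep the cache_size contents with the highest predicted popularity."""
    ranked = sorted(predicted_popularity,
                    key=predicted_popularity.get, reverse=True)
    return set(ranked[:cache_size])

# Hypothetical LSTM-predicted popularity scores for four contents.
popularity = {"video_a": 0.9, "video_b": 0.4, "doc_c": 0.7, "img_d": 0.1}
cache = update_cache(popularity, cache_size=2)
```

Replacing reactive policies such as LRU with predicted popularity is what lets the cache anticipate demand instead of trailing it.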

Keyword :

Predictive models; Content caching; Resource allocation; Optimization; Delays; NOMA; Multi-cell multi-carrier non-orthogonal multiple access; Cloud-edge-terminal cooperation; Task offloading; Computational modeling; Task analysis; Resource management

Cite:


GB/T 7714 Fang, Chao, Xu, Hang, Zhang, Tianyi, et al. Joint Task Offloading and Content Caching for NOMA-Aided Cloud-Edge-Terminal Cooperation Networks [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (10): 15586-15600.
MLA Fang, Chao, et al. "Joint Task Offloading and Content Caching for NOMA-Aided Cloud-Edge-Terminal Cooperation Networks." IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS 23.10 (2024): 15586-15600.
APA Fang, Chao, Xu, Hang, Zhang, Tianyi, Li, Yingshan, Ni, Wei, Han, Zhu, et al. Joint Task Offloading and Content Caching for NOMA-Aided Cloud-Edge-Terminal Cooperation Networks. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (10), 15586-15600.
Address: BJUT Library (100 Pingleyuan, Chaoyang District, Beijing 100124, China). Contact: 010-67392185
Copyright: BJUT Library. Technical support: Beijing Aegean Software Co., Ltd.