#LSTM
Long Short-Term Memory–GPT-4 Integration for Interpretable Biomedical Signal Classification: Proof-of-Concept Study

Background: Approximately 3.8 billion people lack access to essential health services, and diagnostic interpretation remains a major bottleneck in remote and resource-constrained settings. Limited access to specialists and the complexity of biomedical signal interpretation (eg, electrocardiogram [ECG] and electroencephalogram) contribute to delays in recognizing cardiovascular and neurological conditions.

Objective: The study aimed to develop and evaluate a technical framework integrating long short-term memory (LSTM) networks with GPT-4 to provide automated biomedical signal classification and human-readable interpretations, suitable as a foundation for future deployment in resource-constrained environments.

Methods: The 2-layer LSTM architecture (128→64 units) was selected based on preliminary experiments comparing configurations ranging from single-layer networks (64, 128 units) to deeper architectures (128→64→32 units). The chosen configuration balanced model capacity against overfitting risk and computational efficiency. The framework was evaluated using public PhysioNet datasets: Massachusetts Institute of Technology–Beth Israel Hospital (MIT-BIH) Arrhythmia, Physikalisch-Technische Bundesanstalt (PTB) Diagnostic ECG, PTB extra large (PTB-XL), Chapman-Shaoxing, Medical Information Mart for Intensive Care-III (MIMIC-III) Waveforms, and Sleep-European data format (Sleep-EDF). A patient-level split protocol (70/15/15) was used to reduce leakage risk. The LSTM performed temporal feature extraction with softmax-based classification for mutually exclusive classes. GPT-4 was integrated via an application programming interface with structured prompts to generate clinical interpretations from model outputs.

Results: For the expert evaluation, we randomly sampled 50 test cases per dataset (150 total: 30 from each class for MIT-BIH, 25 per class for PTB, and 20 per class for Children's Hospital Boston-Massachusetts Institute of Technology), ensuring balanced class representation. Three board-certified physicians (2 cardiologists for ECG datasets and 1 neurologist for the electroencephalogram dataset) independently reviewed GPT-4–generated interpretations. Reviewers were blinded to whether signals were correctly or incorrectly classified by the LSTM model. Each interpretation was rated on a 5-point Likert scale (1=clinically inappropriate and 5=highly accurate and clinically useful). Interrater reliability was assessed using Fleiss κ (0.78, substantial agreement). On held-out test sets, classification performance was as follows: MIT-BIH 92.3% accuracy (κ=0.91, AUC=0.95), PTB Diagnostic 94.7% (κ=0.94, AUC=0.97), PTB-XL 88.9% (κ=0.88, AUC=0.93), Chapman-Shaoxing 91.2% (κ=0.90, AUC=0.94), MIMIC-III 89.5% (κ=0.89, AUC=0.92), and Sleep-EDF 87.3% (κ=0.86, AUC=0.91). Expert evaluation of generated interpretations (3 board-certified cardiologists) rated clinical accuracy 4.3 out of 5, clarity 4.6 out of 5, and actionability 4.2 out of 5, with strong interrater agreement (κ>0.85).
Conclusions: This proof-of-concept demonstrates an explicit methodological integration of deep learning–based biomedical signal classification with GPT-4–based interpretation and provides a technical foundation for future prospective clinical validation, field studies, and regulatory review prior to clinical deployment in underserved settings.

JMIR Formative Res: Long Short-Term Memory–GPT-4 Integration for Interpretable Biomedical Signal Classification: Proof-of-Concept Study #BiomedicalEngineering #HealthTech #MachineLearning #AIinHealthcare #LSTM
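The abstract pins down the classifier: a 2-layer LSTM (128→64 units) ending in a softmax over mutually exclusive classes. Below is a minimal Keras sketch of that stack; the window length, channel count, and class count are placeholders for illustration, not the paper's actual preprocessing.

```python
# Sketch of the 2-layer LSTM (128 -> 64) classifier described in the abstract.
# Input shape and class count are assumptions: here, 1,000-sample
# single-channel signal windows and 3 mutually exclusive classes.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lstm_classifier(timesteps=1000, channels=1, num_classes=3):
    model = models.Sequential([
        layers.Input(shape=(timesteps, channels)),
        layers.LSTM(128, return_sequences=True),  # first recurrent layer
        layers.LSTM(64),                          # second layer; keeps last hidden state
        layers.Dense(num_classes, activation="softmax"),  # mutually exclusive classes
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```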


TRecViT: A Recurrent Video Transformer

Viorica Patraucean, Xu Owen He, Joseph Heyward et al.

Action editor: Adín Ramírez Rivera

https://openreview.net/forum?id=Mmi46Ytb1H

#lstm #memory #attention

Continuous Authentication in VR Environments Using LSTM: An Experimental Study - Premier Science

Keywords: continuous authentication, head-motion biometrics, LSTM-based user verification, deterministic tokenization, privacy-preserving sensor encryption.

doi.org/10.70389/PJS...

#VirtualReality #VR #LSTM

The Hard Truth About Machine Learning for Amazon FBA Sellers

Amazon FBA demand forecasting breaks because the data is sparse, messy, and constantly shifting. Prophet and vanilla LSTMs often overfit and collapse under seasonality shifts. Real gains come from better feature engineering, TCNs with attention, Ray Tune + ASHA optimization, drift detection, and FBA-specific metrics like stockout penalties. In 2026, hybrid ML + RAG systems are becoming the only durable approach.


Telegram AI Digest
#lstm #machinelearning #rayproject
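The post names "FBA-specific metrics like stockout penalties" without defining one. A minimal sketch of an asymmetric error metric along those lines, where under-forecasting (which causes stockouts) costs more than over-forecasting; the 3x weighting is an assumption, not the article's definition.

```python
import numpy as np

def stockout_weighted_error(y_true, y_pred, stockout_weight=3.0):
    """Mean absolute error that penalizes under-forecasts more heavily.

    Under-forecasting demand risks a stockout, so residuals where
    y_pred < y_true are multiplied by `stockout_weight` (assumed value).
    """
    residual = y_true - y_pred
    weights = np.where(residual > 0, stockout_weight, 1.0)  # under-forecast -> heavier
    return float(np.mean(weights * np.abs(residual)))

# Example: forecasting 90 units when 100 sold costs 3x more than the reverse.
print(stockout_weighted_error(np.array([100.0]), np.array([90.0])))   # 30.0
print(stockout_weighted_error(np.array([100.0]), np.array([110.0])))  # 10.0
```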


📢Artificial intelligence-driven #precipitation #downscaling and projections over #Thailand using #CMIP6 climate models
👉https://doi.org/10.1080/20964471.2025.2547500
💌 #AI #DyNN-Mem #LSTM #CNN #Deeplearning #machinelearning #CMIP6 #hydrology #climatechange #remotesensing #WaterResources

Solar data, still grinding through it by hand? Collect a full year of solar irradiance data automatically in 10 minutes with the Korea Meteorological Administration's free API!

The trick to sweeping up a year of solar data for 50 regions nationwide with just a few lines of code. No more Excel! A complete guide to the KMA observation-statistics bundle API.

#AI #LSTM #기상청API #데이터분석 #일사량 #일조 #태양광데이터 #파이썬
doyouknow.kr/kma-solar-ra...
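The guide's workflow boils down to pulling station statistics over HTTP. A sketch with `requests` follows; the endpoint, parameter names, and response layout below are placeholders, not the KMA API's actual contract, so consult the linked guide for the real ones.

```python
# Sketch of fetching a year of daily irradiance records from an HTTP API.
# BASE_URL and all parameter names are placeholders (assumptions).
import requests

BASE_URL = "https://example-kma-api/daily-stats"   # placeholder endpoint
params = {
    "serviceKey": "YOUR_API_KEY",   # issued when you register for the API
    "stationId": "108",             # placeholder station code
    "startDate": "20240101",
    "endDate": "20241231",
    "dataType": "JSON",
}

resp = requests.get(BASE_URL, params=params, timeout=30)
resp.raise_for_status()
records = resp.json()               # parse and store, e.g. into pandas
print(len(records))
```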

30 Years of Climate Memory, Recalled in 1 Second with Python (feat. the KMA Climate Normals API)

Collect 30 years of climate data in Python with the Korea Meteorological Administration's climate normals API, and forecast climate change with an LSTM model. A complete guide, from judging abnormal climate to AI modeling!

#LSTM #공공데이터 #기상청API #기후데이터 #기후변화 #데이터분석 #파이썬 #평년값
doyouknow.kr/kma-normal-v...
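The post pairs a climate-data API with LSTM forecasting. As a generic first step (not the guide's own code), here is how a daily series is typically windowed into supervised (window, next-value) pairs in the shape an LSTM expects.

```python
import numpy as np

def make_windows(series, window=30):
    """Slice a 1-D series into (window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    X = np.asarray(X, dtype=np.float32)[..., None]  # (samples, window, 1)
    return X, np.asarray(y, dtype=np.float32)

temps = np.sin(np.linspace(0, 20, 365 * 3))  # stand-in for 3 years of daily temps
X, y = make_windows(temps, window=30)
print(X.shape, y.shape)  # (1065, 30, 1) (1065,)
```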


Today, we’re proud to celebrate Dr. Hauwa Mohammed on earning her PhD in Global Health, building on many other amazing achievements. Her dedication and passion for improving maternal and newborn health inspire us all.

Congratulations, Dr. Hauwa!

#lstm #lstmgraduation2025


Haven’t been posting much recently as work has been extra busy preparing for my move next year. Now that we are in November, it’s starting to feel real and close! Time to get excited about #LSTM @lstmnews.bsky.social


Optimization of Forecasting Performance in the Retail Sector Using Artificial Intelligence
www.mdpi.com/2673-4591/11...

By Hoda Jatte et al.
From the ICATH 2025 Conference

#DemandForecasting #AI #MachineLearning #LSTM

Federated Learning Framework Enhances Spatiotemporal Forecasting

A new federated learning framework swaps GRU for LSTM and adds client‑side validation, boosting spatiotemporal forecasting accuracy; the revision was released on 1 Oct 2025. Read more: getnews.me/federated-learning-frame... #federatedlearning #lstm
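The teaser doesn't describe the aggregation step; for orientation, here is a minimal sketch of the standard FedAvg weighted averaging that federated forecasting setups typically build on. This is the generic recipe, not this framework's specific method.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Standard FedAvg: average each client's parameter list,
    weighted by that client's local dataset size."""
    total = sum(client_sizes)
    avg = [np.zeros_like(w) for w in client_weights[0]]
    for weights, n in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            avg[i] += (n / total) * w
    return avg
```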

IntrusionX Hybrid CNN‑LSTM Framework Boosts Network Intrusion Detection

IntrusionX, a CNN‑LSTM IDS tuned with the Squirrel Search Algorithm, reached 98% binary accuracy on the NSL‑KDD benchmark, with 71% recall for U2R attacks. Read more: getnews.me/intrusionx-hybrid-cnn-ls... #intrusiondetection #cnn #lstm
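IntrusionX's exact layer layout and the Squirrel Search Algorithm tuning are beyond the teaser; as rough orientation, a generic CNN-LSTM stack for NSL-KDD-style records (41 features treated as a length-41 sequence, sigmoid head for binary normal-vs-attack), with all layer sizes assumed.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Generic CNN-LSTM intrusion detector (assumed layout, not IntrusionX's design):
# Conv1D extracts local patterns over the 41 NSL-KDD features, an LSTM
# summarizes them, and a sigmoid head scores normal vs attack.
model = models.Sequential([
    layers.Input(shape=(41, 1)),
    layers.Conv1D(32, kernel_size=3, activation="relu", padding="same"),
    layers.MaxPooling1D(2),
    layers.LSTM(64),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```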

LSTM, Random Forest and XGBoost Compared for Solar & Wind Forecasting

A study compares LSTM, Random Forest and XGBoost for solar and wind forecasts, noting Random Forest offers faster inference while LSTM is the most complex model. Read more: getnews.me/lstm-random-forest-and-x... #lstm #randomforest #xgboost


#DTS
Waymo data + LSTM expose lane-change intent: adding Vehicle Operating Space lifts accuracy and recall.🛣️🚙
Details: www.maxapress.com/article/doi/10.48130/DTS...
#vehiclegram #lstm #vehicledesign #lanechange #intent

LSTM Model Improves Short-Term Electricity Forecasting in Argentina

An LSTM model achieved a 3.20% MAPE and 0.95 R² in forecasting hourly electricity demand for Córdoba, Argentina, helping grid operators improve short‑term planning. Read more: getnews.me/lstm-model-improves-shor... #lstm #cordoba
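For readers decoding the headline numbers, MAPE and R² are computed as below; these are the generic definitions, not the study's code.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent (assumes no zero actuals)."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - residual SS / total SS."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot
```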

DeepACTIF Enables Fast Feature Attribution for Neural Sequence Models

DeepACTIF provides feature ranking for LSTM models, enabling on‑device explanations. Tested on three biometric gaze datasets, it maintained accuracy while keeping only the top 10% of features. Read more: getnews.me/deepactif-enables-fast-f... #deepactif #lstm #edgeai

Transformers Beat LSTMs in Multi-Class Mental Health Classification

Transformers outperformed LSTMs in multi‑class mental‑health classification, with RoBERTa hitting 91%‑99% F1 and attention‑augmented LSTMs training 2–3.5× faster. Read more: getnews.me/transformers-beat-lstms-... #transformers #lstm #mentalhealth


Good read: www.mdpi.com/2571-9394/6/...
"Data-Centric Benchmarking of Neural Network Architectures for the Univariate Time Series Forecasting Task"

#timeseries #LSTM #realworlddata #neuralnetworks

Explainable Unsupervised Multi-Anomaly Detection in Nuclear Reactor Data

A dual‑attention LSTM autoencoder detects and localizes anomalies in nuclear reactor sensor data, identifying affected sensors and timing on the PUR‑1 dataset. Read more: getnews.me/explainable-unsupervised... #nuclear #anomalydetection #lstm
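The teaser omits the detection rule itself. LSTM-autoencoder pipelines commonly score each window by reconstruction error and flag scores above a threshold calibrated on normal data; a sketch of that final step, with the mean-plus-3σ rule as an assumption.

```python
import numpy as np

def flag_anomalies(errors, calibration_errors, k=3.0):
    """Flag windows whose reconstruction error exceeds mean + k*std of
    errors measured on known-normal calibration data (assumed rule)."""
    threshold = calibration_errors.mean() + k * calibration_errors.std()
    return errors > threshold, threshold
```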

SyntaxFest 2025 | Ljubljana, Slovenia

The team are heading to Slovenia on Monday to attend the biannual #syntaxfest

DM to meet and discuss #LSTM architectures and the application of the CoNLL-U standard to #lowresourcelanguages and #indigenouslanguage

#nolanguageleftbehind

syntaxfest.github.io/syntaxfest25/

Improving OCR Accuracy in Historical Archives with Deep Learning

Historical OCR has long struggled with noisy scans, rare fonts, and degraded texts. Recent research shows that deep learning approaches—like LSTM networks trained on gray-level data, mixed models spanning centuries of typefaces, and CNN-LSTM hybrids—significantly improve recognition accuracy. New datasets, open-source systems like anyOCR, and tools such as Calamari and Tesseract 4 push OCR closer to human-level performance, achieving accuracy rates as high as 98%. Together, these advancements are transforming how historical archives and rare printings are digitized and preserved for the digital age.


#accuracy #deeplearning #lstm
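On the tooling side, Tesseract 4's LSTM recognition engine is selectable from Python via pytesseract; a minimal call follows, with the image path and language as placeholders.

```python
# Minimal Tesseract 4 call via pytesseract, selecting the LSTM recognition
# engine with --oem 1. The image path and language are placeholders.
from PIL import Image
import pytesseract

text = pytesseract.image_to_string(
    Image.open("scan_page_001.png"),   # placeholder scan
    lang="eng",                        # pick the model for your archive
    config="--oem 1 --psm 3",          # OEM 1 = LSTM engine, PSM 3 = auto page
)
print(text)
```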

What If Your Unique Typing Style Could Become Your Seamless Password?

Design a keystroke-pattern-based authenticator using ML methods like CNN/RNN-LSTM, with real-world examples and code.


#cnn #lstm #rnn
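Ahead of the article's own code, the usual front end of such an authenticator is timing-feature extraction. A sketch of dwell and flight times from (key, press, release) events; the event format is an assumption.

```python
import numpy as np

def keystroke_features(events):
    """Turn (key, press_t, release_t) tuples into per-key dwell times and
    inter-key flight times -- typical inputs to a CNN/RNN-LSTM verifier."""
    dwell = [rel - press for _, press, rel in events]    # how long each key is held
    flight = [events[i + 1][1] - events[i][2]            # release -> next press gap
              for i in range(len(events) - 1)]
    return np.array(dwell), np.array(flight)

# Example: typing "cat" (timestamps in seconds, made up)
events = [("c", 0.00, 0.09), ("a", 0.21, 0.29), ("t", 0.40, 0.47)]
dwell, flight = keystroke_features(events)
print(dwell, flight)  # [0.09 0.08 0.07] [0.12 0.11]
```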

Mechanistic View of Transformers: Patterns, Messages, Residual Stream… and LSTMs

What happens when you stop concatenating and start decomposing: a new way to think about attention.


#ai #lstm #transformer
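The "stop concatenating and start decomposing" line most likely refers to the standard identity that multi-head attention's concat-then-project form equals a sum of independent per-head contributions to the residual stream, obtained by splitting the output projection into per-head row blocks:

```latex
% Split W_O into per-head row blocks W_O^{(i)}, one block per head h_i:
\mathrm{MHA}(x) \;=\; \mathrm{concat}(h_1,\dots,h_H)\,W_O
                \;=\; \sum_{i=1}^{H} h_i\, W_O^{(i)}
```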


A new hybrid #DeepLearning model combining #CNN and #LSTM is achieving 96.06% accuracy in classifying motor imagery #EEG data. Paving the way for faster, more precise #brain-computer interfaces, it could be a big step toward real-time #BCI applications. Details at #ScientificReports: bit.ly/3U1RyBV

Time Series Is Everywhere—Here’s How to Actually Forecast It

Time series isn’t just about ARIMA models anymore. LSTM, GRU, and even Q-learning can forecast prices, detect faults, and outsmart classic baselines. This article dives into how—and why—you should care.


#ai #lstm #news

Titans: Neural Long-Term Memory for Enhanced Sequence Modeling

There’s been a lot of noise lately about scaling, trillion-parameter models, and ultra-long context lengths. But while most attention went…

#llm #machine-learning #lstm #long-term-memory #titan

