The UT researchers investigated a type of AI model called a sequence-to-sequence (Seq2Seq) model with attention.

A typical sequence-to-sequence model has two parts – an encoder and a decoder. Each is effectively a separate neural network, and the two are combined into one larger network.

The encoder network's task is to understand the input sequence and compress it into a smaller-dimensional representation of it.
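The encoder-decoder split described above can be sketched in a few lines. This is a toy illustration with random placeholder weights, not the authors' model: the encoder folds a weekly ILI series into one fixed-size hidden vector, and the decoder unrolls forecasts from that vector.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 8  # size of the encoder's compressed representation (arbitrary)

W_in = rng.normal(size=(1, HIDDEN))              # input -> hidden
W_h = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1    # hidden -> hidden
W_out = rng.normal(size=(HIDDEN, 1))             # hidden -> output

def encode(sequence):
    """Run a simple recurrent step over the input; the final hidden
    state is the lower-dimensional summary of the whole sequence."""
    h = np.zeros(HIDDEN)
    for x in sequence:
        h = np.tanh(np.array([x]) @ W_in + h @ W_h)
    return h

def decode(context, steps):
    """Unroll `steps` outputs from the fixed-size context vector."""
    h, outputs = context, []
    for _ in range(steps):
        h = np.tanh(h @ W_h)
        outputs.append(float(h @ W_out))
    return outputs

ili_history = [1.2, 1.5, 2.1, 3.4, 2.8]  # hypothetical weekly ILI %
context = encode(ili_history)            # fixed-size representation
forecast = decode(context, steps=3)      # 3-week-ahead sketch
print(len(context), len(forecast))       # 8 3
```

A trained model would learn these weight matrices from data; the point here is only the information flow: variable-length input in, fixed-size context, forecasts out.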

The researchers integrated Google Trends data, which gauges consumer interest in influenza at any given point during the flu season, with 'dark' statistics compiled from hospitals by the U.S. Centers for Disease Control and Prevention (CDC) to reveal actionable insights.

They used the unweighted percentage of people with influenza-like illness (ILI) reported by the CDC as the measure of influenza infection.

The UT team then used the unweighted percentage of people with ILI across six states, selected for their climate diversity, from October 2010 to December 2018 (430 weeks).

This study demonstrated that the Seq2Seq model with attention achieved a "significantly higher" Pearson correlation, a measure of the linear correlation between two variables.

They demonstrate that the attention mechanism is highly effective and improves prediction accuracy, achieving a Pearson correlation of 0.996 and a root-mean-square error of 0.67.
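For readers unfamiliar with the two reported metrics, here is how they are defined, computed on small hypothetical numbers (not the study's data): Pearson's r measures the linear correlation between predictions and observations, and RMSE measures the typical size of the prediction error.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation: covariance normalized by both standard deviations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def rmse(pred, truth):
    """Root-mean-square error between predictions and observations."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

truth = [1.0, 2.0, 3.0, 5.0, 4.0]  # hypothetical observed ILI %
pred = [1.1, 2.1, 2.9, 4.8, 4.2]   # hypothetical model output

print(round(pearson_r(pred, truth), 3))  # close to 1 for a good linear fit
print(round(rmse(pred, truth), 3))       # small when predictions are close
```

An r of 0.996 means the predicted and observed curves are almost perfectly linearly related, while the RMSE of 0.67 expresses the remaining error in the units of the ILI percentage itself.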

The researchers said, 'According to our knowledge, a Pearson correlation coefficient of 0.996 is a state-of-the-art result owing to time dependencies.'

The researchers caution that the model's predicted peak value shifted downward as the prediction horizon increased, because the peak time could not be predicted from the training data once the flu epidemic begins to subside.

The experimental results show that the attention mechanism is highly effective at predicting the prevalence of influenza.

When influenza activity is not at its peak, predictions depend on the long-range history of ILI values. In contrast, during the peak activity season, the most recent values are the most informative for prediction. The attention mechanism resolves this shifting time dependency.
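The idea behind that shifting dependency can be sketched with a minimal dot-product attention step (illustrative, not the paper's code): instead of relying only on the encoder's final state, the decoder scores every past week's hidden state and takes a softmax-weighted average, so it can lean on distant history off-peak and on the latest weeks near the peak.

```python
import numpy as np

def attention(query, encoder_states):
    """Dot-product attention: score each time step against the query,
    softmax the scores into weights, and return the weighted average."""
    scores = encoder_states @ query          # one score per week
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states       # attention-weighted summary
    return weights, context

rng = np.random.default_rng(1)
states = rng.normal(size=(5, 4))  # 5 weeks of hidden states, dimension 4
query = states[-1]                # decoder attends via the latest state

weights, context = attention(query, states)
print(weights.round(3))  # one weight per week, summing to 1
```

The weights are what the model learns to redistribute: concentrated on recent weeks during the peak, spread over longer history otherwise.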

To resolve this shortcoming, the UT researchers believe the addition of a leading indicator might further improve the model's accuracy.