**5. Results and discussion**

*Artificial Intelligence - Latest Advances, New Paradigms and Novel Applications*

| Type of dataset | Dataset | Number of classes | Number of images |
| --- | --- | --- | --- |
| Pretrained | Food-101 | 101 | 101,000 |
| Pretrained | UEC-256 | 256 | 31,395 |
| Pretrained | Fruit-360 | 81 | 77,917 |
| Test | UNIMIB2016 | 33 | 1,047 |

**Table 1.**
*Dataset of food images.*

| Dataset | DNN | Number of classes |
| --- | --- | --- |
| food-101 | DNN1 | 34 |
| food-101 | DNN2 | 50 |
| food-101 | DNN3 | 101 |
| uec-256 | DNN1 | 85 |
| uec-256 | DNN2 | 128 |
| uec-256 | DNN3 | 256 |
| fruit-360 | DNN1 | 27 |
| fruit-360 | DNN2 | 40 |
| fruit-360 | DNN3 | 81 |

**Table 2.**
*Deep neural network with different number of classes.*

| No. | Number of DNNs | Type of DNN | Dataset |
| --- | --- | --- | --- |
| 1 | 1 | *GoogLeNet* | Food-101 |
| 2 | 2 | *GoogLeNet* | Food-101 + UEC-256 |
| 3 | 3 | *GoogLeNet* | Food-101 + UEC-256 + Fruit-360 |
| 4 | 1 | *Scratch* | Food-101 |
| 5 | 2 | *Scratch* | Food-101 + UEC-256 |
| 6 | 3 | *Scratch* | Food-101 + UEC-256 + Fruit-360 |

**Table 3.**
*Integrated deep neural network.*

| Types of food label |
| --- |
| 1. Predicted label by integrated DNN using *GoogLeNet* |
| 2. Predicted label by integrated DNN made from *Scratch* |
| 3. Corrected label (no prediction) |

**Table 4.**
*Food labels used for association rule.*

To evaluate the performance of the integrated deep neural networks, we created two single DNNs and four integrated DNNs, as shown in **Table 3**. The *UNIMIB2016* dataset was employed for testing, and the recognition accuracy was calculated using Eq. (3).
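Eq. (3) is defined earlier in the chapter; assuming it is the usual ratio of correctly recognized test images to all test images, the evaluation step can be sketched as follows (the labels below are made up for illustration, not UNIMIB2016 data):

```python
# Sketch of the accuracy calculation, assuming Eq. (3) is the standard
# "correct predictions / total test images" ratio.
def recognition_accuracy(predicted, ground_truth):
    """Return the share of correct predictions (0.0 to 1.0)."""
    assert len(predicted) == len(ground_truth)
    correct = sum(p == t for p, t in zip(predicted, ground_truth))
    return correct / len(ground_truth)

# Toy example with hypothetical labels:
preds = ["steak", "kiwi", "doughnut", "orange", "yogurt"]
truth = ["steak", "kiwi", "pineapple", "orange", "yogurt"]
print(recognition_accuracy(preds, truth))  # 0.8
```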

**Figure 7** shows the recognition accuracy of the DNNs trained on the food-101, uec-256, and fruit-360 datasets, respectively. For each dataset, the recognition accuracy decreased slightly as the number of recognition classes increased. In addition, the DNNs to which *GoogLeNet* was applied performed better than the DNNs made from *Scratch*.

**Figure 7.** *Recognition results of food-101, uec-256 and fruit-360.*

**Figure 8.** *Result of integrated deep neural network.*

**Figure 8** shows the results of the integrated deep neural networks. They imply that increasing the number of component networks enhances the recognition accuracy of the integrated DNNs. In addition, the DNNs to which *GoogLeNet* was applied performed better than the DNNs made from *Scratch*. This is because the proposed reliability score allowed the integrated DNNs to select a suitable recognition result from the three different networks constructed from the food-101, uec-256, and fruit-360 datasets. Note that although the DNN constructed from the fruit-360 dataset achieved a high accuracy of 97.72% for 10-class recognition, it could not return a suitable result for images of general foods other than fruit. This is the disadvantage of a DNN constructed from a specific image dataset.
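The selection step described above can be sketched as follows. The component DNN names, labels, and scores are hypothetical, and the simple max-score rule is only a stand-in for the chapter's reliability score:

```python
# Simplified sketch of the integration step: each component DNN returns its
# top label with a reliability score, and the integrated network keeps the
# most reliable result. The scoring rule here is a placeholder, not the
# chapter's actual reliability score.
def integrate(results):
    """results: list of (source, label, score) tuples from component DNNs."""
    return max(results, key=lambda r: r[2])

# Hypothetical outputs for one test image:
outputs = [
    ("food-101 DNN", "steak", 0.71),
    ("uec-256 DNN", "beef steak", 0.64),
    ("fruit-360 DNN", "kiwi", 0.12),  # off-domain network, low reliability
]
source, label, score = integrate(outputs)
print(label)  # steak
```

This illustrates why the fruit-360 network rarely wins on general food images: its reliability score stays low outside its own domain.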

**Tables 5**–**7** show association-rule results for the food labels, sampled from the whole set of results. For "Kiwi, Doughnut" in **Table 5**, the numbers of good and bad results were almost equal, whereas "Pineapple mini, Ice-cream" in **Table 6** gave better results, confirming that the integrated networks using *GoogLeNet* produced more good results. In **Table 7**, "Steak, French fries" showed good results relevant to the test data.
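A minimal brute-force version of this mining step can be sketched as follows. The meal "transactions" are made up for illustration (the chapter applies the *Apriori* algorithm; exhaustive enumeration is used here only to keep the sketch short):

```python
from itertools import combinations

# Hypothetical meal transactions standing in for the recognized canteen trays.
transactions = [
    {"steak", "french fries", "spaghetti"},
    {"steak", "french fries", "roll bread"},
    {"steak", "french fries", "spaghetti"},
    {"kiwi", "doughnut", "orange"},
]

def support(itemset):
    """Fraction of transactions containing every item of `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Frequent itemsets (support >= 0.5), found by exhaustive enumeration.
items = sorted(set().union(*transactions))
frequent = [frozenset(c)
            for k in (1, 2, 3)
            for c in combinations(items, k)
            if support(frozenset(c)) >= 0.5]

# Association rules antecedent -> consequent with confidence >= 0.8.
rules = []
for itemset in frequent:
    for a in combinations(sorted(itemset), len(itemset) - 1):
        if not a:
            continue  # skip empty antecedents
        antecedent, consequent = frozenset(a), itemset - frozenset(a)
        confidence = support(itemset) / support(antecedent)
        if confidence >= 0.8:
            rules.append((antecedent, consequent, confidence))

for antecedent, consequent, confidence in rules:
    print(sorted(antecedent), "->", sorted(consequent), round(confidence, 2))
```

Each printed rule corresponds to one "Antecedents / Consequents" row in the tables below; the chapter then judges each mined rule as good or bad against the test data.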





*A Food Recommender Based on Frequent Sets of Food Mining Using Image Recognition*

| Antecedents | Consequents | Result |
| --- | --- | --- |
| Steak, French fries | Banana, Roll bread | Good |
| Steak, French fries | Baklava, Orange | Good |
| Steak, French fries | Roll bread, Orange | Good |
| Steak, French fries | Spaghetti, Orange | Good |
| Steak, French fries | Baklava, Roll bread | Bad |
| Steak, French fries | Spaghetti, Baklava | Good |
| Steak, French fries | Spaghetti, Roll bread | Good |
| Steak, French fries | Yogurt, Roll bread | Good |
| Steak, French fries | Spaghetti, Yogurt | Good |
| Steak, French fries | Spaghetti | Good |

**Table 7.**
*Results of corrected labels.*

**6. Conclusion**

In the food recognition phase, the integrated networks (DNNs) showed higher recognition accuracy (80%) than a single network, because the proposed reliability score allowed the integrated networks to select a suitable recognition result from networks trained on different domains. The networks to which *GoogLeNet* was applied gave higher recognition accuracy. In addition, it was found that when the test dataset differed from the training datasets, we could not obtain suitable results.

In the data mining phase, we could extract meaningful rules by applying the *Apriori* algorithm to the recognition results of the canteen image dataset. In future work, we will modify the recognition phase of this system to increase the performance of the networks, and we will evaluate the effectiveness of the modified system on a larger food dataset. In addition, we are developing visual food mining using a mixed model of DNNs and RNNs (recurrent neural networks) to continue this research.

**Acknowledgements**

This research is a part of the academic cooperation between Chulalongkorn University, Thailand, and Kanagawa Institute of Technology (KAIT), Japan. The authors would like to thank KAIT for providing a scholarship and deeply appreciate Professor Kosuke Takano for his advice and for the facilities of his laboratory at KAIT.

**Conflict of interest**

The authors declare no potential conflict of interest.

*DOI: http://dx.doi.org/10.5772/intechopen.97186*



| Antecedents | Consequents | Result |
| --- | --- | --- |
| Kiwi, Doughnut | Pineapple, Orange | Good |
| Kiwi, Doughnut | Orange, Miso soup | Bad |
| Kiwi, Doughnut | Spaghetti Bolognese, Orange | Good |
| Kiwi, Doughnut | Orange, Spam musubi | Bad |
| Kiwi, Doughnut | Pineapple, Miso soup | Bad |
| Kiwi, Doughnut | Pineapple, Spaghetti Bolognese | Good |
| Kiwi, Doughnut | Pineapple, Spam musubi | Bad |
| Kiwi, Doughnut | Spaghetti Bolognese, Miso soup | Bad |
| Kiwi, Doughnut | Spam musubi, Miso soup | Good |
| Kiwi, Doughnut | Spaghetti Bolognese, Spam musubi | Bad |

**Table 5.**
*Results of integrated DNNs made from Scratch.*

| Antecedents | Consequents | Result |
| --- | --- | --- |
| Pineapple mini, Ice-cream | Apple red/yellow | Good |
| Pineapple mini, Ice-cream | Cantaloupe | Good |
| Pineapple mini, Ice-cream | Cocos | Good |
| Pineapple mini, Ice-cream | French loaf | Good |
| Pineapple mini, Ice-cream | Granadilla | Bad |
| Pineapple mini, Ice-cream | Grapefruit pink | Good |
| Pineapple mini, Ice-cream | Kiwi | Good |
| Pineapple mini, Ice-cream | Peach flat | Good |
| Pineapple mini, Ice-cream | Peach abate | Good |
| Pineapple mini, Ice-cream | Pitahaya Red | Good |

**Table 6.**
*Results of integrated DNN using GoogLeNet.*

