**3.1. Improving the product recommendations using rule-based sentiments from ontology**

The rule-based sentiments mined from the PROO ontology specify the relations between a parent feature and its child feature, and also reveal the relations among related product features. The opinion strength of the feature whose sentiment is to be determined by the machine also carries its importance in the rule. In addition, the sentiments calculated for each product feature after extraction from the reviews are stored separately for further mapping. The detailed procedure for improving the product recommendations is expressed in step-by-step form below. The symbols used in the steps are as follows: O is the PROO ontology. P<sub>i</sub> is the product, with i = 1, 2, 3, … The sentiment of the product feature F<sup>j</sup> of the product P<sub>i</sub> is represented as Sentiment(F<sup>j</sup>, P<sub>i</sub>), where j = 1, 2, 3, … Pos(F<sup>j</sup>, P<sub>i</sub>), Neg(F<sup>j</sup>, P<sub>i</sub>), and Neu(F<sup>j</sup>, P<sub>i</sub>) are the positive, negative, and neutral product features, and count() is the number of occurrences of each polarity kind. Parentof(Fjkparent\_node, Fjkchild\_node) is the feature hierarchy in the ontology. Objectproperty(node<sub>a</sub>, node<sub>b</sub>) is the fact about related product features. Strength(node, rel(int)) is the opinion strength of a feature as present in the review. The depth of a node in the ontology and the height of the ontology are the ontology tree measures. The asterisk '\*' in the steps represents the multiplication operator.
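The symbols above can be pictured as a small data structure. The following is only an illustrative sketch: the class and field names are hypothetical, the feature depths are made up, and the height value of 3 is chosen to be consistent with the 0.03 update in the worked example later in this section.

```python
# Hypothetical sketch of the ontology facts used by the sentiment rules below.
from dataclasses import dataclass, field

@dataclass
class PROOOntology:
    parentof: set = field(default_factory=set)        # (parent_feature, child_feature) pairs
    objectproperty: set = field(default_factory=set)  # related (non-taxonomical) feature pairs
    strength: dict = field(default_factory=dict)      # opinion strength per feature, from reviews
    depth: dict = field(default_factory=dict)         # depth of each feature node in the hierarchy
    height: int = 0                                   # height of the ontology tree

O = PROOOntology(
    parentof={("battery", "battery life")},
    objectproperty={("RAM", "performance"), ("screen", "display")},
    strength={"battery": 3, "battery life": 3, "RAM": 2.5, "performance": 2.5},
    depth={"battery": 1, "battery life": 2},  # illustrative depths
    height=3,
)
print(("battery", "battery life") in O.parentof)  # True
```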


Sentiment(F<sup>j</sup>, P<sub>i</sub>) = [count(Pos(F<sup>j</sup>, P<sub>i</sub>)) − count(Neg(F<sup>j</sup>, P<sub>i</sub>))] / [count(Pos(F<sup>j</sup>, P<sub>i</sub>)) + count(Neg(F<sup>j</sup>, P<sub>i</sub>)) + count(Neu(F<sup>j</sup>, P<sub>i</sub>))]
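As a quick check of this formula, a small helper (the function name is ours, not the authors') reproduces the scores reported in the examples later in this section — e.g. RAM with 6 positive, 5 negative, and 0 neutral mentions gives ≈ 0.09.

```python
def sentiment(pos: int, neg: int, neu: int) -> float:
    """Sentiment(F, Pi) = (pos - neg) / (pos + neg + neu)."""
    total = pos + neg + neu
    return (pos - neg) / total if total else 0.0

# RAM of 'Samsung Galaxy j7 prime': 6 positive, 5 negative, 0 neutral mentions
print(round(sentiment(6, 5, 0), 2))  # 0.09
# battery: 6 positive, 0 negative, 0 neutral mentions
print(sentiment(6, 0, 0))  # 1.0
```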


```
if (parentof(Fjkparent_node, Fjkchild_node) == true)
{
    if (Sentiment(Fjkparent_node, Pi) < Sentiment(Fjkchild_node, Pi))
    {
        Sentiment(Fjkparent_node, Pi) = Sentiment(Fjkparent_node, Pi)
            + [Sentiment(Fjkchild_node, Pi) * depth of the Fjkchild_node];
        New_Sentiment(Fjkparent_node, Pi) = Sentiment(Fjkparent_node, Pi);
    }
    if (Sentiment(Fjkchild_node, Pi) == Sentiment(Fjkparent_node, Pi))
        continue;
}
```

```
else if (objectproperty(nodea, nodeb) && strength(node, rel(int)) == true)
{
    if (Sentiment(Fjknodea, Pi) <= 0)
    {
        Sentiment(Fjknodea, Pi) = Sentiment(Fjknodea, Pi)
            + height of the ontology / 100;  /* to keep the change in the score small */
        New_Sentiment(Fjknodea, Pi) = Sentiment(Fjknodea, Pi);
    }
    if (Sentiment(Fjknodeb, Pi) <= 0)
    {
        Sentiment(Fjknodeb, Pi) = Sentiment(Fjknodeb, Pi)
            + height of the ontology / 100;  /* to keep the change in the score small */
        New_Sentiment(Fjknodeb, Pi) = Sentiment(Fjknodeb, Pi);
    }
}
```

**Figure 2.** Model for improving the sentiments of the product features.

The improvement of the sentiments of the product features using the knowledge mined from the PROO ontology for improved product recommendations is shown in **Figure 2**. The first two modules, i.e., the development of the PROO ontology and semantic data mining of the PROO ontology, were already carried out by the researchers in their work in [17]. The main component proposed here, improving the sentiments of the product features using that mined knowledge for better product recommendations, is described in this section with the algorithm pseudo-code.

190 Machine Learning - Advanced Techniques and Emerging Applications
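The two update rules in the pseudo-code above can be sketched as plain Python functions. This is an illustrative sketch only: the function names and dict-based inputs are ours, and the non-taxonomical helper assumes the strength(node, rel(int)) check has already succeeded for the related pair.

```python
def apply_taxonomical_rule(sent: dict, parentof: set, depth: dict) -> dict:
    """If a parent feature's sentiment is below its child's, raise the parent's
    sentiment by the child's sentiment weighted by the child's depth."""
    new_sent = dict(sent)
    for parent, child in parentof:
        if parent in sent and child in sent and sent[parent] < sent[child]:
            new_sent[parent] = sent[parent] + sent[child] * depth[child]
        # equal sentiments fall through: no update (the 'continue' branch)
    return new_sent

def apply_nontaxonomical_rule(sent: dict, objectproperty: set, height: int) -> dict:
    """Related nodes with non-positive sentiment get a small height/100 boost."""
    new_sent = dict(sent)
    for a, b in objectproperty:
        for node in (a, b):
            if node in sent and sent[node] <= 0:
                new_sent[node] = sent[node] + height / 100
    return new_sent

# 'display' scored 0; with ontology height 3 it becomes 0.03, matching the worked example
scores = apply_nontaxonomical_rule({"screen": 1.0, "display": 0.0},
                                   {("screen", "display")}, height=3)
print(scores["display"])  # 0.03
```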

**6.** Sort the products in the descending order based on the enhanced sentiments of the k-common features.

**7.** Recommend the products.

Sentiment-Based Semantic Rule Learning for Improved Product Recommendations

http://dx.doi.org/10.5772/intechopen.72514

193

The explanation of the steps is as follows: given the product searched by the end user on the e-commerce site, all similar products are recommended. Initially, the algorithm retrieves all the similar products' data from the ontology with respect to the user-searched product. The common product features of the retrieved products and the searched product are called 'k-common features'. Next, for each of the k-common features, the corresponding sentiment is calculated using the number of positive mentions and the number of negative mentions of the feature. Whenever a neutral mention is identified, it is also counted and used in the sentiment calculation. Then the taxonomical and non-taxonomical sentiment rules on the product features are retrieved from the ontology. The target sentiment instances Positive and Negative are mapped to the minimum and maximum sentiment scores of the product features to create a sentiment range. The following examples clarify how the improved product recommendations are returned to the customer when a search for a product takes place. The dataset details for the examples discussed are given in **Table 1**, which was first presented in Section V.

The product 'Samsung Galaxy j7 prime' has battery and battery life as one of its pairs of taxonomical features. The numbers of positive and negative mentions of the battery are 6 and 0; there are no neutral mentions. The numbers of positive and negative mentions of the battery life are 1 and 0; there are no neutral mentions. The sentiment scores obtained after calculation for battery and battery life are 1 and 1, respectively. The opinion strengths for battery and battery life obtained from the review dataset are 3 and 3. By applying these features as instances in the taxonomical sentiment rule, the semantic sentiment learned is positive. The sentiment scores of battery and battery life are now mapped to the Positive sentiment label.

| Document attributes | Values |
|---|---|
| Number of review documents | 300 |
| Minimum sentences per review | 9 |
| Maximum sentences per review | 15 |

**Table 1.** Reviews dataset details.

The product 'Samsung Galaxy j7 prime' has RAM and performance as one of its pairs of non-taxonomical features. The numbers of positive and negative mentions of the RAM are 6 and 5; there are no neutral mentions. The numbers of positive and negative mentions of the performance are 6 and 2; there are no neutral mentions. The sentiment scores obtained after calculation for RAM and performance are 0.09 and 0.1, respectively. The opinion strengths for RAM and performance obtained from the review dataset are 2.5 and 2.5. By applying these features and opinion strength values as instances in the non-taxonomical sentiment rule, the semantic sentiment learned is positive. The sentiment scores of RAM and performance are now mapped to the Positive sentiment label.

The similar products are retrieved from the ontology by querying the 'similarTo' object property for the corresponding instance values of the customer-searched product. Now, for each k-common feature among all the retrieved products in the ontology, whenever there exists a taxonomical constraint and the sentiment of the parent feature node in the ontology is less than the sentiment of the child feature node, the sentiment of the parent feature node is updated by adding the weighted sentiment of the child feature node. The weight is the depth of the child feature node in the ontology. This kind of analysis is possible as specified by [6], who state that the importance of a feature is determined by the depth of the feature in the ontology. This analysis views the taxonomical features 'as-a-unit.' Whenever the sentiment of the parent feature node is equal to the sentiment of the child feature node, no update is carried out on these nodes.

Once all the taxonomical constraints are analyzed, the non-taxonomical constraints are analyzed as well, to learn the related features and the contributions to their sentiment values. When the sentiments of the related nodes are less than or equal to zero, the sentiments of the related nodes are updated by adding a ratio of 1/100th of the height of the ontology, so that the score stays within the sentiment range. The height of the ontology is used in this update because the related nodes may be present at any level of the ontology other than the root.

The product 'Samsung Galaxy j7 prime' has sentiment scores of 1 and 1, respectively, for battery and battery life. There is no update in the sentiment value of either feature, because the sentiment values of the parent feature (battery) and the child feature (battery life), which fall under the taxonomical constraints, are equal.

The product 'Samsung Galaxy j7 prime' has sentiment scores of 1 and 0, respectively, for screen and display. There is an update in the sentiment value of the feature 'display', because the sentiment value of display is equal to zero. The updated sentiment value for the feature 'display' is 0.03. The product features screen and display fall under the non-taxonomical constraints.

Finally, the products are sorted in the descending order of the enhanced sentiments. The sorted list is provided as the product recommendations to the customer.

**4. Design decisions in the implementation of ontology**

Description logic (DL) is used in reasoning over the instances of the ontology. DL is the mathematics behind the constructs of the ontology. The engineered PROO ontology has DL expressivity
