The individual optimization problem is a mixed-integer program for each node. Lagrangian relaxation is a heuristic method for solving mixed-integer problems by decomposing them along their complicating constraints. The idea of the method is to relax the constraints that complicate the problem, adding them to the objective function weighted by an associated vector of Lagrange multipliers μ. Applying it to our problem, we derive the dual problem of Eq. (9), which can be solved efficiently, by relaxing the complicating constraints 1 and 2:

min L(x; μ, λ) = Σ_{j=1..n} x_j U_ij + μ_j ( I_size − Σ_{j=1..n} x_j A_j^MEM ) + λ_j ( I_size − Σ_{j=1..n} x_j A_j^STR )    (11)

subject to

H_ij ≤ Hop_max
B_ij ≤ C_max
x_{i,j} ∈ {0, 1}  ∀ i, j

where the utility function for a node is

U_ij = w_1 L_ij + w_2 (1 / B_ij) + w_3 H_ij + w_4 W_j^CPU + w_5 W_j^MEM + w_6 W_j^STR

and μ_j ≥ 0 ∀ j and λ_j ≥ 0 ∀ j are the dual variables.

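To make the utility function concrete, the toy computation below evaluates U_ij for one candidate node; the weight values and node measurements are invented for illustration and are not taken from the chapter.

```python
# A toy evaluation of the utility function U_ij defined above.
# Lower values indicate a more attractive node for placing work.

def utility(w, latency, bandwidth, hops, cpu_load, mem_load, storage_load):
    """U_ij = w1*L_ij + w2*(1/B_ij) + w3*H_ij
             + w4*W_j_CPU + w5*W_j_MEM + w6*W_j_STR"""
    return (w[0] * latency + w[1] * (1.0 / bandwidth) + w[2] * hops
            + w[3] * cpu_load + w[4] * mem_load + w[5] * storage_load)

w = [0.3, 0.2, 0.1, 0.2, 0.1, 0.1]  # example weights
u = utility(w, latency=20, bandwidth=100, hops=3,
             cpu_load=0.5, mem_load=0.4, storage_load=0.2)
```
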
Having obtained the above optimization problem, it is separable in the variables x_i and decomposes into sub-problems, one for each node i. Thus each node only needs to solve the one-dimensional optimization problem of Eq. (11), which consists of its own utility function and the Lagrange multipliers available to node i. The subgradient method is generally used to solve the resulting dual problem because of the simplicity of its computations per iteration. First-order methods such as the subgradient method converge slowly when high accuracy is required, but they are very effective in large-scale multi-agent optimization problems where the aim is to find near-optimal approximate solutions [55].

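The decomposition described above can be sketched as follows: for fixed multipliers each node solves its 0/1 subproblem, and the multipliers are then updated with a projected subgradient ascent step on the dual. The function names, step size, toy data, and the simplification to a single multiplier per relaxed constraint are assumptions for illustration, not the chapter's implementation; a practical solver would use a diminishing step size and keep the best feasible solution found.

```python
# Sketch of dual decomposition with a projected subgradient update.

def node_subproblem(u, mu, lam, a_mem, a_str):
    """Pick x_j in {0, 1} minimizing its Lagrangian term
    x_j*(U_ij - mu*A_j_MEM - lam*A_j_STR); the I_size terms
    are constant in x and can be ignored here."""
    return 1 if u - mu * a_mem - lam * a_str < 0 else 0

def subgradient(utils, a_mem, a_str, i_size, steps=200, alpha=0.01):
    mu = lam = 0.0
    for _ in range(steps):
        x = [node_subproblem(u, mu, lam, am, as_)
             for u, am, as_ in zip(utils, a_mem, a_str)]
        # The subgradients are the slacks of the two relaxed constraints.
        g_mu = i_size - sum(xj * am for xj, am in zip(x, a_mem))
        g_lam = i_size - sum(xj * as_ for xj, as_ in zip(x, a_str))
        # Projection keeps the dual variables nonnegative.
        mu = max(0.0, mu + alpha * g_mu)
        lam = max(0.0, lam + alpha * g_lam)
    return x, mu, lam

x, mu, lam = subgradient([1.0, 2.0], [5.0, 5.0], [4.0, 6.0], 8.0)
```
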
When an optimal approximate solution has been found, the RS data should be divided into partitions and distributed proportionally to the nodes selected in the solution, according to the performance heuristic (Eq. (1)) given in Section 6.1.

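Once the node set is fixed, the proportional distribution step can be sketched as below; the integer scores stand in for the performance heuristic of Eq. (1), and the function name and values are invented for illustration.

```python
# Split a dataset proportionally to per-node performance scores.

def partition_sizes(total_size, scores):
    """Give each selected node a share of total_size proportional to its
    score, assigning any rounding remainder to the highest-scoring node."""
    total_score = sum(scores)
    sizes = [int(total_size * s / total_score) for s in scores]
    sizes[scores.index(max(scores))] += total_size - sum(sizes)
    return sizes

# e.g. a 1000 MB RS image over three nodes with scores 5, 3, 2
print(partition_sizes(1000, [5, 3, 2]))  # → [500, 300, 200]
```
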
5. Conclusion

As a result of technological developments, the amount of data produced daily by many organizations has grown to terabyte levels. Remotely sensed data, which is spatially and spectrally enriched and heterogeneous owing to the variety of sensing techniques, causes great difficulties in storing, transferring, and analyzing with conventional methods. Implementing distributed approaches in place of conventional methods, which prove inadequate, has become a necessity in critical applications where real- or near-real-time analysis of such big data is needed. Existing distributed file systems, databases, and high-performance computing systems are experiencing difficulties in


Author details

Mustafa Kemal Pektürk¹* and Muhammet Ünal²

