Electrical Engineering Department, Iran University of Science & Technology (IUST), Narmak, 1684613114 Tehran, Iran

Research Institute of Petroleum Industry (RIPI), Shahre Rey, 1485733111 Tehran, Iran

Blind identification of MIMO FIR systems has received wide attention in various fields of wireless data communications. Here, we use Particle Swarm Optimization (PSO) as the update mechanism of the well-known inverse filtering approach and show that it outperforms the original method. In particular, the proposed method is more robust in low-SNR scenarios and in cases with shorter available data records. We also present a modified version of PSO that further improves the robustness and precision of the PSO algorithm. The most important advantage of this modified version, however, is its drastically faster convergence compared to the standard implementation of PSO.

1. Introduction

This paper addresses the problem of blind identification of multi-input multi-output (MIMO) channels in the general scenario, where all of the

Here

The inverse filtering approach [1] is an iterative solution that extracts the sources from the mixture one by one: after each source is extracted, the channel it experienced is estimated, and the signal is reconstructed as it was originally observed on each sensor. Once the reconstructed source has been subtracted from the sensor measurements, the same procedure can be applied to extract the remaining sources. The source extraction step is done by steepest-descent maximization of a class of cumulant-based cost functions with respect to the coefficients of the equalizer filters. This optimization strategy, however, is prone to being trapped in local maxima, especially in low-SNR scenarios or when the available data record is too small [1].

An alternative to this gradient-based optimization is a structured stochastic search of the objective function space. These types of global searches are structure independent because no gradient is calculated and the adaptive filter structure does not directly influence the parameter updates. Due to this property, such algorithms are potentially capable of globally optimizing any class of objective functions [2]. Particle Swarm Optimization (PSO) is one of these stochastic structured search algorithms and has recently gained popularity for optimization problems.

This paper investigates the application of the PSO technique to the source extraction step of the above-mentioned procedure for mutually independent, zero-mean i.i.d. binary sequences. It should be noted that although [3] addresses the same subject, its update equation reads more like a heuristic that employs a randomly weighted addition of gradients based on second-order statistics and prediction methods. PSO, in contrast, is defined as a cooperative random search of particles toward their current global and local best points in the search space according to some fitness function [4], as discussed in Section 3.1; simply multiplying the update equation by a random number (as in [3]) does not capture the essence of the global random search suggested by PSO.

In this paper, after studying the suitability of standard PSO for the blind identification problem, we propose a modified version that finds its initial direction according to the original gradient-based method of [1]. In effect, the random search promised by PSO is confined to smaller local regions that are most likely to maximize the cost function. We also augment the original fitness function used in [1, 3] with two supportive performance indexes in order to reduce the probability of algorithm failures caused by the original objective function's complete ignorance of additive noise.

2. Iterative Source Extraction and Channel Estimation

In this section, we provide a detailed description of the inverse filtering approach. Note that here we neglect the presence of noise

Here

Based on the third- and fourth-order cumulants of the equalizer output, [1] introduced a set of objective functions

Here

Here,

So the maximization of

This separation criterion, however, has the weakness of completely ignoring the presence of noise. The authors in [1] developed their theoretical proof of the suitability of the above-mentioned objective function under the assumption that the noise

If we consider the presence of noise

However, it is obvious that (6) would still provide a local maximum for

After extraction of each source, the estimated signal can be used for estimation of the experienced channel by that signal through its way to each sensor:

Now, the third step is to reconstruct the extracted source as it was originally observed on each sensor, in order to suppress its contribution to the sensor observations:

Hereafter, the same procedure can be used to extract the remaining sources and then estimate the SIMO channel experienced by each of them.
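The extract-estimate-reconstruct-subtract cycle described above can be sketched as a deflation loop. This is only an illustrative skeleton, assuming hypothetical `extract_source` and `estimate_simo` callables that stand in for the paper's equalizer optimization and channel estimation steps:

```python
import numpy as np

def deflation_extract(x, extract_source, estimate_simo, n_sources):
    """Sketch of the iterative (deflation) source-extraction loop:
    extract one source, estimate the SIMO channel it experienced,
    reconstruct its contribution on every sensor, subtract, repeat.
    `extract_source` and `estimate_simo` are placeholders for the
    equalizer optimization and channel-estimation steps of [1]."""
    residual = x.copy()                       # sensors x samples
    sources, channels = [], []
    for _ in range(n_sources):
        s = extract_source(residual)          # equalize the residual mixture
        h = estimate_simo(residual, s)        # channel seen by s on each sensor
        # reconstruct the source as observed on each sensor and subtract it
        for m in range(residual.shape[0]):
            residual[m] -= np.convolve(s, h[m], mode="full")[: residual.shape[1]]
        sources.append(s)
        channels.append(h)
    return sources, channels
```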

3. Particle Swarm Optimization

3.1. PSO Principles

Particle swarm optimization [2] is a stochastic, population-based evolutionary algorithm for problem solving in complex multidimensional parameter spaces. It is a kind of swarm intelligence based on social-psychological principles.

A multidimensional optimization problem is given, along with an objective function to evaluate the fitness of each candidate point in the parameter space; the swarm is typically modeled by particles in this multidimensional space, each having a position and a velocity. After a random population of individuals (particles) is defined as candidate solutions, they fly through the parameter hyperspace with the aid of two essential reasoning capabilities: a memory of their own best local position and knowledge of the global (or their neighborhood's) best [5].

PSO begins by initializing a random swarm of

Here,

In fact, the trajectory of each particle is determined by a random superposition of its previous velocity with the locations of the local and global best particles found so far. As new
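The standard velocity and position updates described in this subsection can be sketched as follows. This is a minimal illustration, not the paper's implementation; the inertia and acceleration constants (`w`, `c1`, `c2`), the bounds, and the swarm size are illustrative defaults:

```python
import numpy as np

def pso_maximize(fitness, dim, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0), seed=0):
    """Minimal standard PSO: each particle's velocity is a random
    superposition of its previous velocity, a pull toward its own best
    position (cognitive term), and a pull toward the swarm's best
    position (social term). Returns the best position and its fitness."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive (local best) + social (global best) terms
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()
```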

3.2. Implementation of PSO for Source Extraction

We propose to use PSO as the optimization method for maximization of

Clearly

The simplicity of PSO suggests that we can easily combine several objective functions to better evaluate the true fitness of a candidate particle and thus avoid the trap of fake global maxima in the case of noisy and shorter data records. For instance, since we expect complete deconvolution of the extracted source at each iteration, a simple choice is to evaluate the level of correlation between successive samples of the equalizer output. Clearly, there will be no time dependency between successive samples at the ideal point of separation and deconvolution, as in (5). This allows us to exploit signal correlations at nonzero lags as in [7]. Specifically, we expect the following cost function to converge to zero when true separation is met.
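A correlation-at-nonzero-lags cost of this kind can be sketched as below. The exact form and lag range of the paper's cost function are not reproduced here; this is an illustrative variant that sums squared normalized autocorrelations over a few lags:

```python
import numpy as np

def lag_correlation_cost(y, max_lag=5):
    """Sum of squared normalized autocorrelations of the equalizer
    output at nonzero lags. For an i.i.d. output (full separation and
    deconvolution) every term vanishes in expectation, so the cost
    approaches zero. The lag range `max_lag` is illustrative."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    denom = np.dot(y, y)
    return sum((np.dot(y[:-k], y[k:]) / denom) ** 2
               for k in range(1, max_lag + 1))
```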

Also, we can use the histogram of the equalizer output as a clue for leading the particles toward the desired solution. For the assumed equiprobable sequence of

Histogram of successfully extracted source at equalizer's output.


Irregular histogram of equalizer's output when algorithm fails.


Then, as another measure of fitness, we can use the frequency of occurrence of samples with very small absolute values (e.g., smaller than 0.2) or with values larger than 2 as the fitness parameter

Here
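This histogram-based index can be sketched as the fraction of output samples falling outside the expected ±1 clusters. The thresholds 0.2 and 2 follow the text; the function name and normalization are illustrative assumptions:

```python
import numpy as np

def histogram_penalty(y, inner=0.2, outer=2.0):
    """Fraction of equalizer-output samples that fall near zero
    (|y| < inner) or far outside the expected +/-1 levels (|y| > outer).
    For a correctly extracted equiprobable binary source this fraction
    is close to zero; an irregular histogram (algorithm failure)
    produces a noticeably larger value."""
    y = np.asarray(y, dtype=float)
    bad = np.sum(np.abs(y) < inner) + np.sum(np.abs(y) > outer)
    return bad / y.size
```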

As will be shown in Section 4, although this combination of three different cost functions improves the performance of the PSO algorithm in noisy scenarios, it also restricts the approval of new global and local best particles to some extent, and it may slow down the search for the optimum parameter vector even in high-SNR cases. There is therefore a tradeoff between convergence speed and the probability of algorithm failures in noisy environments.

In practice, one general weakness of PSO is that its local search is not guaranteed to converge: depending on the selected swarm size and its initialization, it may settle in a local, non-optimal solution. Moreover, the relatively slower convergence of the combined fitness function (13) requires larger swarm sizes and more iterations of the algorithm. Hence, some modifications to the original position update (11) are introduced, using an approach similar to that suggested in [2]. We modify (11) into the following hybrid update equation:

Here,

From now on, we will refer to this modified version of PSO as hybrid PSO.

4. Simulation Results

In this section we compare the performance of the proposed hybrid PSO of (14) with the steepest gradient method of [1] for the source extraction step of the well-known inverse filtering approach. Since comparison with [3] was impossible due to its abstruse presentation of update equations, we compare our proposed modified version with the standard implementation of PSO (as in (11)) in terms of computational complexity, fidelity, and robustness against noise. We chose

Inputs

In the first simulation, the equalizer length was chosen to be 15

The performance index used for evaluation of

In which
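A normalized mean-square error index of this kind can be sketched as below. The paper's exact normalization is not reproduced here; this is a generic NMSE in dB between true and estimated channel impulse responses, and in practice a scale/permutation alignment step would precede it:

```python
import numpy as np

def channel_nmse_db(h_true, h_est):
    """Generic NMSE between true and estimated channel impulse
    responses, in dB: squared error energy normalized by the energy
    of the true response. Illustrative sketch, not necessarily the
    paper's exact performance index."""
    h_true = np.asarray(h_true, dtype=float).ravel()
    h_est = np.asarray(h_est, dtype=float).ravel()
    nmse = np.sum((h_true - h_est) ** 2) / np.sum(h_true ** 2)
    return 10.0 * np.log10(nmse)
```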

The parameters of the PSO and hybrid PSO algorithms were chosen according to the tradeoff between convergence speed and algorithm runtime. A population of 100 particles was used, with a maximum of 400 allowable iterations for PSO and 60 iterations for hybrid PSO.

Figure

Comparison of NMSE for three algorithms in different SNRs.


As mentioned previously, the steepest gradient approach has difficulty coping with shorter available data records. In order to evaluate our proposed algorithm under such conditions, another simulation was run with different numbers of available observed samples. Thirty Monte Carlo simulations were run at

Comparison of NMSE of channel impulse response estimations for different lengths of available data records (


The effect of the parameters in (13) on identification quality.


It is interesting to note the stability of the PSO methods, especially the hybrid PSO. If we define NMSEs larger than

Standard deviation of NMSE of channel impulse response estimations for three methods (

| **Algorithm** | **Steepest gradient** | **Standard PSO** | **Hybrid PSO** |
|---|---|---|---|
| Mean (dB) | −16.8 | −17.9 | −18.7 |
| Std (dB) | −3.43 | −2.1 | −0.56 |
| Std/Mean | 20.2% | 11.9% | 2.8% |
| Best (dB) | −19.41 | −19.2 | −19.33 |
| Worst (dB) | −8.96 | −12.1 | −14.23 |
| Number of complete failures | 7 | 5 | 0 |

For a final comparison of PSO and the proposed hybrid PSO, the convergence speed and computational complexity of the two must be studied. Simulation results show that for hybrid PSO, a small population of fewer than 30 particles is usually enough for very fine convergence of the swarm to the global optimum, whereas standard PSO takes hundreds of iterations even with large populations of more than 100 particles. The very fast convergence and the precision of optimization are thus the two most important advantages of hybrid PSO.

In order to study the effect of

Finally, it should be mentioned that our implementation of both PSO and hybrid PSO was the simplest possible, in order to keep the computational complexity and algorithm runtime as close as possible to those of the steepest gradient approach. However, it is always possible to further improve PSO algorithms by employing larger swarm populations and more allowable iterations, at the cost of greater computational complexity. The performance of any PSO algorithm can also be further improved by strategically selecting the starting positions of the particles [8].

5. Conclusion

Two different realizations of Particle Swarm Optimization for the source extraction step of the well-known inverse filtering MIMO identification approach were studied. Both achieve results as satisfying as the original steepest descent method in noise-free scenarios, and they show moderate improvement at lower SNRs and with smaller data lengths.

The hybrid PSO algorithm also exhibited significantly faster convergence than standard PSO, while keeping the initial population of particles smaller. This was its main advantage over standard PSO, besides being the most precise method with the least probability of complete failure.

Acknowledgment

We gratefully acknowledge partial financial support of this research by the French