
Abstract

Image sensor-based visible light positioning can be applied not only to indoor environments but also to outdoor environments. To determine the performance bounds of the positioning accuracy from the view of statistical optimization for an outdoor image sensor-based visible light positioning system, we analyze and derive the maximum likelihood estimation and corresponding Cramér–Rao lower bounds of vehicle position, under the condition that the observation values of the light-emitting diode (LED) imaging points are affected by white Gaussian noise. For typical parameters of an LED traffic light and in-vehicle camera image sensor, simulation results show that accurate estimates are available, with positioning error generally less than 0.1 m at a communication distance of 30 m between the LED array transmitter and the camera receiver. With the communication distance being constant, the positioning accuracy depends on the number of LEDs used, the focal length of the lens, the pixel size, and the frame rate of the camera receiver.

1.

Introduction

In recent years, with the rapid development of solid-state lighting technology, white light-emitting diodes (LEDs) have been widely used for lighting, display, and the transmission and/or reception of data.1,2 Compared with incandescent and fluorescent lights, white LEDs have the characteristics of long life expectancy, high energy efficiency, and low cost, and they can be modulated at a relatively high speed that is undetectable to the human eye. To date, a considerable amount of white LED-based research has been developed and may fall into two categories: visible light communication (VLC) and visible light positioning (VLP).

For VLC, the intrinsic features of white LEDs make them suitable for high-speed communication. First, VLC is based on the lighting function of white LEDs. To ensure sufficient light intensity, illumination levels of 400 to 1000 lux3 are often required. Therefore, the signal-to-noise ratio is high enough for VLC. Second, the radiation spectrum of white LEDs spans from 400 to 800 THz; thus high channel capacity is achievable in accordance with the Shannon formula. At present, most research on high-speed VLC is confined to the indoor environment, mainly to improve the modulation bandwidth of LEDs,4 develop improved modulation technology,5 and design multiplexing schemes.6 The highest data rate reported so far was achieved by a wavelength division multiplexing (WDM) VLC system,7 where carrierless amplitude and phase modulation and adaptive equalization are jointly used to reach 4.5 Gbps in the laboratory.

For VLP,8–10 according to the optical reception device used at the receiver, it can be divided into photodiode (PD)-based VLP and image sensor (IS)-based VLP. Since the PD is susceptible to the direction of the light beam, a PD-based VLP system can fail if the PD is flipped over or moved out of the range covered by the LED; thus it has limited mobility and is only suitable for slow motion or quasistatic conditions. In addition, PDs cannot be utilized in outdoor environments with direct solar radiation. A PD can only detect the optical power of incoming light, and because direct solar radiation is usually strong, the PD is saturated by the intense optical power owing to its limited response range. Therefore, most of the proposed PD-based VLP systems are designed for indoor positioning.11–17

For the IS-based VLP,9,18–20 an IS is used as the optical reception device. An IS can detect not only the intensity but also the angle of arrival (AOA) of incoming light. An IS consists of many pixels, so different light sources can be spatially separated by their imaging points on the IS. Here, the light sources include various LED sources (such as an indoor LED dome light, an outdoor LED traffic light, or the LED brake light or headlight of a vehicle) and noise sources (such as the Sun and other ambient lights). By differentiating the imaging positions of light sources, LED sources can be recognized among multiple noise sources with a simple feature-matching algorithm. Consequently, IS-based VLP is available not only for indoor but also for outdoor environments. Furthermore, combined with image processing and digital signal processing technology, IS-based VLP can be utilized for driving safety applications such as collision warning and avoidance, lane change assistance, pedestrian detection, and adaptive cruise control.

To date, the published papers on IS-based VLP have mostly focused on applied research, such as indoor navigation systems,21 outdoor intelligent traffic systems,22–25 and various location-based services.26 These studies have shown that accurate localization is achievable; however, little has been published about the analytical performance bounds of the positioning accuracy from the view of statistical optimization. Determining these bounds allows the optimization of the parameters governing IS-based VLP systems.

The contributions of this paper are as follows. First, we analyze and derive the maximum likelihood estimation (MLE) and corresponding Cramér–Rao lower bounds (CRLB) for a typical outdoor IS-based VLP system, assuming white Gaussian model for system noise. Second, we analyze the effect of system parameters on CRLB. When a camera IS is used as receiver, there exist several types of noise generated from IS. As shot noise takes the dominant role, the system noise variance is influenced by many factors, such as the total received optical power, the pixel size, the focal length, and the frame rate of the camera receiver. Because the derived CRLB is proportional to system noise variance, we will emphatically analyze the system noise and the parameters affecting system noise variance in this paper.

The rest of this paper is organized as follows. In Sec. 2, an outdoor IS-based VLP system model is introduced, where the transmitter is the LED array of traffic light, and the camera receiver is assumed to be mounted on the dashboard of a vehicle. In Sec. 3, the MLE of the vehicle position is derived under the condition that the observation values of the LEDs’ imaging points are affected by white Gaussian noise. The performance analysis is completed in Sec. 4, where the CRLB is deduced and the parameters affecting CRLB are analyzed in detail. In Sec. 5, simulation results are given for a typical outdoor scenario. Conclusions are made in Sec. 6.

Notations: The operators {·}T, E[·], and var(·) denote the transpose of a matrix, the expectation, and the variance of a random variable or matrix, respectively.

2.

System Model

For the outdoor IS-based VLP system, as shown in Fig. 1, the transmitter may be the LED array of the traffic light in a city crossing, and the receiver may be a camera IS mounted on the dashboard of a vehicle. The signal from the LED and the image of the LED on the camera receiver are jointly used to determine the location of the receiver, which is assumed to be the vehicle position.

Fig. 1

An outdoor IS-based VLP system model, where any LED Pi from the LED array transmitter is imaged through the pinhole Os onto an imaging point pi in the image plane of the camera receiver fixed on a vehicle, with Pi=(Xi,Yi,Zi)T, i=1,2,…,N being the 3-D world coordinate; pi=(xi,yi)T, i=1,2,…,N being the 2-D image plane coordinate; and Os=(Xs,Ys,Zs)T being the 3-D world coordinate of the center of the camera receiver. The goal is to estimate the vehicle position, which is assumed to be the world coordinates Os of the center of the camera receiver under the condition that the system is affected by white Gaussian noise. The system parameters are listed in Table 1.

Table 1

System parameters.

Height of traffic light: Zi = 6.21 m

Height of camera receiver: Zs = 1.0 m

Length of traffic arm: 1.0 m

Width of lane: 3.5 m

Width of vehicle: 1.8 m

In Fig. 1, there are three coordinate systems: the three-dimensional (3-D) world coordinate system, the 3-D camera coordinate system, and the two-dimensional (2-D) image plane coordinate system. Any LED Pi (i=1,2,…,N) from the LED array transmitter is imaged onto an imaging point pi (i=1,2,…,N) in the image plane through the center of the lens. It is assumed that LED Pi is located at Pi=(Xi,Yi,Zi)T in the 3-D world coordinate system and that this location is known a priori. The imaging point of LED Pi is pi=(xi,yi)T (i=1,2,…,N) in the 2-D image plane coordinate system, which can be measured via image processing and signal processing techniques. However, the measured value of the imaging point is influenced by noise. When shot noise is the dominant noise source, the system noise can be modeled as white Gaussian noise.27,28 Hence, our goal is to estimate the location of the camera receiver under white Gaussian noise, obtain the MLE, and finally derive the CRLB.

Any LED Pi from the LED array transmitter satisfies

(1)

Pi=RPci+Os,

where Pci=(Xci,Yci,Zci)T, i=1,2,…,N is the coordinate of LED Pi in the camera coordinate system. The camera coordinate system has its origin at the center of the camera receiver, and its Zc axis is perpendicular to the 2-D image plane; the Zc axis is usually called the optical axis. R is the 3×3 orthogonal rotation matrix of the camera receiver from the camera coordinate system to the world coordinate system. Os=(Xs,Ys,Zs)T is the world coordinate of the center of the camera receiver, which is taken as the vehicle position since the camera receiver is fixed on the vehicle, for example, on the dashboard.

The rotation of the camera receiver from the camera coordinate system to the world coordinate system is shown in Fig. 2. The rotation angles ϕ and ω can be read directly from the inclination sensor attached to the camera receiver; however, the azimuth angle κ has to be calculated:

(2)

For simplicity, in this paper we assume that the vehicle travels on reasonably flat terrain without azimuth rotation; that is, the orientation of the camera coordinate system is the same as that of the world coordinate system, so the rotation matrix from the camera coordinate system to the world coordinate system can be expressed as R=E, where E denotes the identity matrix. In addition, since the camera receiver is fixed on the dashboard of the vehicle, the height Zs of the camera receiver is known a priori; the distance between the traffic light and the camera receiver along the direction of the optical axis is then h=Zi−Zs.

Fig. 2

Rotating process of the camera receiver from the camera coordinate system to the world coordinate system.

In the 3-D camera coordinate system, the relationship between Pci=(Xci,Yci,Zci)T, i=1,2,…,N and pi=(xi,yi)T, i=1,2,…,N can be described, with the focal length of the lens being f, as

(3)

Xci/xi = Yci/yi = Zci/(−f).

Rearranging Eqs. (1) and (3), we get the mathematical relationship between the LEDs (Xi,Yi)i=1N and the measurement values (x˜i,y˜i)i=1N of their imaging points, which can be written as

(4)

(x̃i, ỹi)T = (−f/h)(Xi − Xs, Yi − Ys)T + (nxi, nyi)T,

where the measurement noises {nxi}i=1N and {nyi}i=1N are independent white Gaussian noise components in the x and y directions of the 2-D IS plane, each with zero mean and variance σ².

Our goal is to estimate the parameter vector r=(Xs,Ys)T of the vehicle position, derive its MLE, and finally get the CRLB for white Gaussian noise.
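As a concrete illustration of the measurement model in Eq. (4), the following Python sketch generates noisy imaging points for a small patch of LEDs. All numerical values here (focal length, distance, LED layout, noise level, true position) are illustrative assumptions, not values fixed by the model.

```python
import numpy as np

# A minimal sketch of the measurement model in Eq. (4). All numbers are
# illustrative assumptions, not values mandated by the model.
rng = np.random.default_rng(0)

f = 35e-3        # focal length of the lens (m)
h = 30.0         # distance along the optical axis (m), assumed
sigma = 1e-4     # std of the white Gaussian noise in the image plane
Os = np.array([2.02, 30.8])   # true vehicle position (Xs, Ys), assumed

# Assumed (Xi, Yi) world coordinates of a 2x2 patch of LEDs, 15 mm apart
P = Os + np.array([[0.0, 0.0], [0.015, 0.0], [0.0, 0.015], [0.015, 0.015]])

# Eq. (4): (x~i, y~i) = (-f/h)(Xi - Xs, Yi - Ys) + (nxi, nyi)
p = (-f / h) * (P - Os) + rng.normal(0.0, sigma, size=P.shape)
print(p.shape)
```

Each row of `p` is one noisy imaging point; the deterministic part is just a scaled, sign-flipped copy of the LED offsets from the camera position.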

3.

Maximum Likelihood Estimation

Based on the measured values {x̃i}i=1N and {ỹi}i=1N and the LED coordinates {Xi}i=1N and {Yi}i=1N, the log-likelihood function of the parameter vector r=(Xs,Ys)T of the vehicle position is given as

(5)

ln(Xs,Ys) = −N ln(2πσ²) − (1/2σ²) ∑i=1N {[x̃i + (f/h)(Xi − Xs)]² + [ỹi + (f/h)(Yi − Ys)]²}.

Differentiating the log-likelihood function with respect to Xs gives

(6)

∂ln(Xs,Ys)/∂Xs = (f/σ²h) ∑i=1N [(f/h)(Xi − Xs) + x̃i].

Let ∂ln(Xs,Ys)/∂Xs=0; then the MLE of the position parameter Xs is given as

(7)

X̂s = (1/N) ∑i=1N Xi + (h/Nf) ∑i=1N x̃i.

Similarly, differentiating the log-likelihood function with respect to Ys gives

(8)

∂ln(Xs,Ys)/∂Ys = (f/σ²h) ∑i=1N [(f/h)(Yi − Ys) + ỹi].

Let ∂ln(Xs,Ys)/∂Ys=0; then the MLE of the position parameter Ys is given as

(9)

Ŷs = (1/N) ∑i=1N Yi + (h/Nf) ∑i=1N ỹi.

Define X̄ = (1/N)∑i=1N Xi, Ȳ = (1/N)∑i=1N Yi, x̄ = (1/N)∑i=1N x̃i, and ȳ = (1/N)∑i=1N ỹi.

We can express the MLE of the vehicle position as

(10)

X̂s = X̄ + (h/f)x̄,
Ŷs = Ȳ + (h/f)ȳ.

Consequently, the MLE of the vehicle position can be obtained from the means of the measured values {x̃i}i=1N and {ỹi}i=1N and the means of the LED coordinates {Xi}i=1N and {Yi}i=1N.
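The closed-form estimator of Eq. (10) is simple enough to check numerically. The Python sketch below simulates noisy imaging points via Eq. (4) and recovers the position from sample means as Eq. (10) prescribes; the parameter values and the LED layout are assumptions chosen for illustration.

```python
import numpy as np

# Sketch of the MLE in Eq. (10); all parameter values are assumptions.
rng = np.random.default_rng(1)

f, h, sigma = 35e-3, 30.0, 1e-4       # focal length, distance, noise std
Os = np.array([2.02, 30.8])           # true vehicle position (assumed)
# Assumed world coordinates (Xi, Yi) of N = 16 LEDs
P = Os + rng.uniform(-0.25, 0.25, size=(16, 2))

# Noisy imaging points per Eq. (4)
p = (-f / h) * (P - Os) + rng.normal(0.0, sigma, size=P.shape)

# Eq. (10): Xs_hat = X_bar + (h/f) x_bar, Ys_hat = Y_bar + (h/f) y_bar
est = P.mean(axis=0) + (h / f) * p.mean(axis=0)
print(est)   # close to the true position (2.02, 30.8)
```

Averaging over the N LEDs is what drives the estimation error down as N grows, which is exactly the 1/N factor that reappears in the CRLB of Sec. 4.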

Figure 3 shows the estimated values of Xs and Ys when σ is 10−3. The estimates fluctuate around the true values (Xs=2.02 m and Ys=30.8 m) because each run of the simulation uses an independent noise realization.

Fig. 3

Estimation of Xs and Ys when σ=10−3.

4.

Performance Analysis

The CRLB gives a lower bound on the variance attainable by any unbiased estimator, so the performance of an estimation method can be assessed by comparing it with the CRLB. The regularity condition of the CRLB29 holds for the given estimation problem since the derivatives in Eqs. (6) and (8) are finite and their expected values are 0.

The CRLB of the vector parameter r=(Xs,Ys)T can be obtained in three steps. First, from Eqs. (6) and (8) we get the second-order derivatives of the log-likelihood function with respect to Xs and Ys, respectively. Second, taking the negative expectations of the second-order derivatives yields the Fisher information matrix

(12)

I(r) = [Nf²/(h²σ²), 0; 0, Nf²/(h²σ²)].

Finally, the CRLB of the vector parameter r=(Xs,Ys)T is obtained by taking the [i,i]'th element of the inverse of I(r), i=1,2, from Ref. 29. The inverse of the 2×2 Fisher information matrix I(r) is expressed as

(13)

I−1(r) = [h²σ²/(Nf²), 0; 0, h²σ²/(Nf²)].

Consequently, the CRLB of the vehicle position for white Gaussian noise is given as

(14)

var(X̂s) ≥ h²σ²/(Nf²),
var(Ŷs) ≥ h²σ²/(Nf²).

Figure 4 shows the performance comparison of the CRLB and the mean square positioning error (MSPE) for σ² ∈ [−20,60] dB. The MSPE is defined as E[(X̂s−Xs)²+(Ŷs−Ys)²]. Note that the decibel scale is employed on both axes to facilitate the presentation.30 It can be seen that the MSPE is proportional to σ² and closely approaches the CRLB.
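To see why the MSPE hugs the bound, note that from Eq. (10) the estimation error in each coordinate is (h/f) times the mean of N noise samples, whose variance is exactly h²σ²/(Nf²). The following Monte Carlo sketch, with assumed parameter values, verifies this:

```python
import numpy as np

# Monte Carlo check that the MSPE of the MLE attains the CRLB of Eq. (14).
# Parameter values are illustrative assumptions.
rng = np.random.default_rng(2)

f, h, sigma, N, trials = 35e-3, 30.0, 1e-4, 16, 20000
crlb = h**2 * sigma**2 / (N * f**2)   # per-coordinate bound, Eq. (14)

# From Eq. (10), the MLE error per coordinate is (h/f) times the mean of
# N noise samples, so the error can be simulated directly.
err = (h / f) * rng.normal(0.0, sigma, size=(trials, N, 2)).mean(axis=1)
mspe = (err**2).sum(axis=1).mean()    # E[(Xs_hat-Xs)^2 + (Ys_hat-Ys)^2]
print(mspe / (2 * crlb))              # ratio should be close to 1
```

The factor 2 appears because the MSPE sums the errors of both coordinates, each of which is individually bounded by the CRLB.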

Fig. 4

Comparison of CRLB and MSPE.

From Eq. (14), we know that the CRLB is proportional to the noise variance σ², with the number of LEDs used N, the focal length f of the camera receiver, and the distance h known. However, when a camera IS is used as the receiver of an outdoor IS-based VLP system, several types of noise are generated by the IS. When shot noise takes the dominant role, the system noise variance is influenced by many factors, such as the total received optical power, the pixel size, the focal length, and the frame rate of the camera receiver.

In the following, we will emphatically analyze the system noise in the IS-based VLP system and the parameters affecting system noise variance.

4.1.

System Noise

There are two basic types of noise generated by an IS: pattern noise (PN) and random noise (RN). PN can be directly observed by the human eye and has a fixed spatial distribution that does not vary from frame to frame. The effect of PN on image quality is far greater than that of RN, but it can be effectively suppressed or eliminated through correlated double sampling or flat-field correction. Hence, the effect of PN is not considered in this paper.

The quantized values of RN vary with each frame of the image, and RN obeys a statistical distribution. One typical RN is shot noise, which is generated by the random variation of photoinduced charge carriers with incoming light in the semiconductor of the camera receiver. When the number of photoinduced charge carriers is large enough, shot noise follows a Gaussian distribution and can be treated as white noise. In the IS-based VLP system, shot noise is mainly made up of three parts: quantum noise generated at the observation point of the image corresponding to each LED, quantum noise coming from the interference of other LEDs, and ambient light noise from fluorescent or incandescent lights, the Sun, and so on. Since the IS has the ability to spatially separate sources, the imaging points of discrete LEDs on the camera IS receiver can be resolved; that is, the noise from the interference of other LEDs is so small that it can be lumped into the ambient light noise. Hence, when shot noise takes the dominant role, the system noise variance can be expressed as

(15)

σ² = σshot² = 2qρ(Pr + Pn·Atotal)·I2·Rb,

where q is the electronic charge, ρ is the conversion coefficient from the optical to the electrical domain and is often assumed to be 0.4 mA/mW, Pr is the total received optical power of the camera receiver, Pn is the power of ambient light noise per unit area, Atotal is the total detecting area of the IS, A is the effective detecting area corresponding to a single LED, I2 is the noise bandwidth factor (often I2=0.562), and Rb is the data transmission rate. Because the frame rate is equal to the sampling rate of the camera, if Nyquist sampling is used, the frame rate of the camera should be at least twice the data transmission rate.
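As a numerical illustration of Eq. (15), the sketch below evaluates the shot-noise variance. Pr and Pn are placeholder assumptions (the actual Pr follows from Eq. (16) in the next subsection), while q, ρ, I2 come from the text and Atotal from the Table 3 sensor parameters.

```python
# Shot-noise variance per Eq. (15). Pr and Pn are placeholder assumptions;
# q, rho, I2 come from the text, and A_total from the Table 3 sensor.
q = 1.602e-19               # electronic charge (C)
rho = 0.4                   # O/E conversion, 0.4 mA/mW = 0.4 A/W
I2 = 0.562                  # noise bandwidth factor
Rs = 1000.0                 # frame rate (fps)
Rb = Rs / 2.0               # Nyquist: data rate at most half the frame rate
Pr = 1e-6                   # assumed total received optical power (W)
Pn = 1e-3                   # assumed ambient noise power per unit area (W/m^2)
A_total = (1024 * 10e-6) ** 2   # total detecting area of the IS (m^2)

sigma2 = 2 * q * rho * (Pr + Pn * A_total) * I2 * Rb
print(sigma2)
```

The linear dependence on Rb (and hence on the frame rate) is what makes the CRLB grow with the frame rate in Sec. 5.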

4.2.

Parameters Affecting System Noise Variance

In this paper, such a channel scenario is utilized for the outdoor IS-based VLP system, as shown in Fig. 5(a), where the LED array transmitter is placed on horizontal ground and the camera receiver is fixed on the dashboard of a vehicle, with the center of the LED array transmitter in the optical axis of the camera receiver.

Fig. 5

4.2.1.

Total received optical power

If N LEDs are used to locate an IS receiver, the total received optical power of the IS is Pr = ∑i=1N Hi(0)Pt when each LED transmits constant optical power Pt over a line-of-sight (LOS) channel. A lateral view of the transmitter–receiver channel is shown in Fig. 5(b). For the i'th channel, i=1,2,…,N, Hi(0) = (m+1)A cos^m(ϕi)cos(φi)/(2πDi²) is the DC gain of the LOS link, m is the order of Lambertian emission (generally m=1), ϕi is the angle of irradiance, φi is the angle of incidence with 0≤φi≤φC, and φC is the field of view (FOV) of the IS receiver. Di is the propagation distance from each LED transmitter to the camera receiver. With the communication distance between the LED transmitter and the camera receiver being h, if ϕi=φi then cos(ϕi)=h/Di, and the total received optical power can be expressed as

(16)

Pr = C(φ)·Pt·h⁻²·A,

where C(φ) = ∑i=1N (m+1)cos^(m+3)(φi)/(2π), which depends on the incidence angles. It is assumed that all incidence angles of the LOS links are within the FOV of the receiver.
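The received-power expression of Eq. (16) can be evaluated directly. In the sketch below the incidence angles are assumptions (four LEDs at roughly 5 deg), while Pt, h, and the pixel-limited effective area A = w² follow the Table 2 and 3 values.

```python
import numpy as np

# Total received optical power per Eq. (16); incidence angles are assumed.
m = 1.0                     # Lambertian order (generally m = 1)
Pt = 100e-3                 # optical power per LED (W), Table 2
h = 30.0                    # communication distance (m)
w = 10e-6                   # pixel width (m); A = w^2 once h >= dc
A = w**2

phi = np.deg2rad(np.full(4, 5.0))   # assumed incidence angles for N = 4 LEDs
C = np.sum((m + 1) * np.cos(phi) ** (m + 3) / (2 * np.pi))
Pr = C * Pt * h**-2 * A     # Eq. (16): Pr = C(phi) * Pt * h^-2 * A
print(Pr)
```

Note the h⁻² decay of Pr: halving the distance quadruples the received power, which in turn lowers the shot-noise floor of Eq. (15).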

4.2.2.

Effective area for detecting

It is necessary to calculate the image size corresponding to one LED when a camera IS is used as the receiver. The imaging process of a single LED through the lens onto the camera IS is shown in Fig. 6. According to Newton's formula, if the diameters of the LED and its image are L and l, respectively, then for a focal length f and a distance h between the LED and the lens, the relationship between these parameters satisfies l = fL/h. The distance at which the LED's image falls into exactly one pixel is referred to as the critical distance dc. If h ≥ dc, the image of the LED falls into only one pixel, and the effective detecting area is A = w², where w is the width of a pixel. If h < dc, the image of the LED falls onto several pixels, and A = l² = (fL/h)².
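A small sketch of this geometry, using the Table 2 and 3 values (f = 35 mm, L = 6 mm, w = 10 μm): it computes the critical distance dc = fL/w and the distance-dependent effective detecting area.

```python
# Critical distance and effective detecting area (Sec. 4.2.2),
# with f, L, w taken from Tables 2 and 3.
f, L, w = 35e-3, 6e-3, 10e-6

# The image diameter l = f*L/h shrinks to one pixel width w at dc = f*L/w
dc = f * L / w              # 21 m for these parameters

def effective_area(h):
    """Effective detecting area A for one LED at distance h (m)."""
    return w**2 if h >= dc else (f * L / h) ** 2

print(dc, effective_area(30.0), effective_area(10.0))
```

For the simulated 30-m scenario the receiver is beyond dc, so a single LED occupies exactly one pixel and A = w².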

Fig. 6

Imaging process of a single LED on the camera IS.

5.

Numerical Results

Simulation experiments are performed in the channel scenario shown in Fig. 5(a), where the LED array transmitter is placed on horizontal ground and the camera receiver is fixed on the dashboard of a vehicle, with the center of the LED array transmitter on the optical axis of the camera receiver. The communication distance between the LED array transmitter and the camera receiver is varied from 15 to 60 m in 5-m steps under static conditions. White LEDs are used for the LED array transmitter, and a Photron IDP-Express R2000 is used as the camera IS receiver. The parameters are listed in Tables 2 and 3, respectively.

Table 2

Parameters for the LED array transmitter.

Number of LEDs: 32 × 32

Spacing of LEDs: 15 mm

Size of LED array: 465 mm × 465 mm

Diameter of an LED: L = 6 mm

Transmitted optical power of an LED: Pt = 100 mW

Table 3

Parameters for the camera image sensor receiver.

Focal length: f = 35 mm

Pixel size: w = 10 μm

Frame rate: Rs = 1000 fps

Resolution: 1024 pixel × 1024 pixel

In the following, we present simulation results for the CRLB of the positioning system described in the previous sections over a range of parameters: the communication distance, the pixel size, and the focal length and frame rate of the camera receiver.

First, we study the influence of the communication distance on the CRLB. Figure 7 shows the CRLB versus the communication distance between the LED transmitter and the camera receiver, from 15 to 60 m with a step size of 15 m under static conditions. The positioning accuracy decreases with increasing communication distance. When the communication distance between the LED array transmitter and the camera receiver is 60 m, the CRLB of the vehicle position is about 0.35 m. However, when the distance is shortened to 15 m, the CRLB of the vehicle position is less than 0.05 m.

Fig. 7

Influence of communication distance on the CRLB, with the distance between the LED array transmitter and the camera receiver varied from 15 to 60 m in 15-m steps under static conditions. The camera receiver has a focal length of 35 mm, a pixel width of 10 μm, and a frame rate of 1000 fps.

Second, we study the influence of the pixel width on the CRLB. Figure 8 plots the CRLB versus the number of LEDs, showing that the positioning error decreases as the number of LEDs increases. As the pixel width is reduced from 25 to 10 μm, the CRLB drops. When four LEDs are used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the vehicle position is less than 0.1 m.

Fig. 8

Influence of pixel width on the CRLB, with the pixel width varied from 25 to 10 μm in steps of 5 μm. The communication distance between the LED array transmitter and the camera receiver is 30 m, and the camera receiver has a focal length of 35 mm and a frame rate of 1000 fps.

Next, we study the impact of the focal length on the CRLB. In Fig. 9, the CRLB is plotted as a function of the number of LEDs used. As the focal length increases from 20 to 35 mm, the CRLB falls. This figure again shows that low values of the CRLB are achievable for typical camera IS parameters. For four LEDs used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the vehicle position is less than 0.1 m.

Fig. 9

Influence of focal length on the CRLB, with the focal length varied from 20 to 35 mm in steps of 5 mm. The communication distance between the LED array transmitter and the camera receiver is 30 m, and the camera receiver has a pixel width of 10 μm and a frame rate of 1000 fps.

Finally, we investigate how the CRLB behaves as the frame rate of the camera receiver is varied. In Fig. 10, the CRLB is plotted versus the number of LEDs for various frame rates. For a given number of LEDs, the CRLB drops with decreasing frame rate. For four LEDs used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the position of the camera receiver at a frame rate of 1000 fps is about 0.5 m. This falls to only about 0.05 m when the frame rate is decreased to 30 fps. Therefore, the positioning accuracy improves as the frame rate is reduced. However, a lower frame rate (which is equal to the sampling rate of the camera IS) directly limits the achievable data rate. This is why high-speed cameras are usually utilized for VLC, while medium- and low-speed cameras are used for VLP.

Fig. 10

Influence of frame rate on CRLB with the frame rate from 30 to 1000 fps. The communication distance between the LED array transmitter and the camera receiver is 30 m, and the camera receiver has a focal length of 35 mm and a pixel width of 10μm.

6.

Conclusion

For a typical outdoor scenario, theoretical limits on estimating the location of an in-vehicle camera receiver are calculated by deriving the CRLB. Under the condition that the observation values of the LED imaging points are affected by white Gaussian noise, the MLE of the vehicle position is first calculated, and then the CRLB is derived. For typical parameters of a white LED array and an in-vehicle camera IS, simulation results show that accurate location estimation is achievable, with the positioning error usually on the order of centimeters at a communication distance of 30 m between the LED array transmitter and the camera receiver. For a constant communication distance, the positioning accuracy depends on the number of LEDs used, the focal length of the lens, and the pixel size and frame rate of the camera receiver. The derived CRLB provides a theoretical basis of statistical analysis for the optimization of outdoor IS-based VLP systems.

Acknowledgments

This work was supported by the Natural Science Foundation of China under Grant Nos. 61261017, 61362006, and 61371107, the Natural Science Foundation of Guangxi under Grant Nos. 2014GXNSFAA118387 and 2013GXNSFAA019334, the Key Laboratory Foundation of Guangxi Broadband Wireless Communication and Signal Processing under Grant No. GXKL061501, the Guangxi Colleges and Universities Key Laboratory Foundation of Intelligent Processing of Computer Images and Graphics under Grant No. GIIP201407, and the High-Level Innovation Team of New Technology on Wireless Communication in Guangxi Higher Education Institutions.

Biography

Xiang Zhao received her BS degree in information engineering and her MS degree in communication and information systems from Guilin University of Electronic Technology, Guilin, China, in 2001 and 2006, respectively. She is currently working toward her PhD in communication and information systems from Xidian University, Xian, China, and her current research interests are visible light communication and visible light positioning.

Jiming Lin received his MSc degree from the University of Electronic Science and Technology of China in 1995 and his PhD from Nanjing University in 2002. Then he held a two-year postdoctoral fellowship at the State Key Laboratory for Novel Software Technology at Nanjing University. Since 2004, he has been a professor of Guilin University of Electronic Technology. His research interests are in synchronization and localization in WSNs, UWB communication, and visible light communication and positioning.
