Abstract
Optical fibre sensors based on Brillouin scattering have been vigorously studied in the context of structural health monitoring on account of their capacity for distributed strain and temperature measurements. However, real-time distributed strain measurement has been achieved only for two-end-access systems; such systems reduce the degree of freedom in embedding the sensors into structures, and furthermore render the measurement no longer feasible when extremely high loss or breakage occurs at a point along the sensing fibre. Here, we demonstrate real-time distributed measurement with an intrinsically one-end-access reflectometry configuration by using a correlation-domain technique. In this method, the Brillouin gain spectrum is obtained at high speed using a voltage-controlled oscillator, and the Brillouin frequency shift is converted into a phase delay of a synchronous sinusoidal waveform; the phase delay is subsequently converted into a voltage, which can be directly measured. When a single-point measurement is performed at an arbitrary position, a strain sampling rate of up to 100 kHz is experimentally verified by detecting locally applied dynamic strain at 1 kHz. When distributed measurements are performed at 100 points with 10 times averaging, a repetition rate of 100 Hz is verified by tracking a mechanical wave propagating along the fibre. Some drawbacks of this ultrahigh-speed configuration, including the reduced measurement accuracy, lowered spatial resolution and limited strain dynamic range, are also discussed.
Introduction
Brillouin scattering1 is regarded as one of the most promising principles for fibre-optic sensing because it enables distributed measurements of strain and temperature. The Brillouin-based distributed sensors developed so far can be classified into two types: ‘analysis’, in which two light beams are injected into both ends of a fibre under test (FUT), and ‘reflectometry’, in which a light beam is injected into only one end of an FUT. The former category includes Brillouin optical time-domain analysis (BOTDA)2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, Brillouin optical frequency-domain analysis (BOFDA)13, 14, 15, 16 and Brillouin optical correlation-domain analysis (BOCDA)12, 17, 18, 19, 20, 21, 22, 23, 24, 25, whereas the latter category includes Brillouin optical time-domain reflectometry (BOTDR)26, 27, 28, 29, 30 and Brillouin optical correlation-domain reflectometry (BOCDR)31, 32, 33, 34, 35, 36, 37. For each technique, a number of different configurations have been proposed to improve the sensing performance in terms of spatial resolution, measurement range, signal-to-noise ratio (SNR), sampling rate, repetition rate, strain and temperature sensitivity, strain dynamic range, system simplicity and cost efficiency. Here, we focus on Brillouin sensors with a high sampling (or repetition) rate for distributed dynamic strain measurement.
To date, extremely high sampling rates have been achieved in BOTDA3, 4, 5, 6, 7, 8, 9, 10, 11 and BOCDA19, 20, 25. In particular, Voskoboinik et al.3 have proposed the idea of using multiple pumps and probes in BOTDA to avoid time-consuming frequency sweeping, thus leading to a potential increase in measurement speed at the expense of frequency granularity. Furthermore, inspired by the idea of Bernini et al.4, Peled et al.5 have developed slope-assisted BOTDA, which interrogates the FUT with a single frequency located at the middle of the slope of the local Brillouin gain spectrum (BGS), thereby enabling a single pump pulse to sample the strain distribution along the full length of the FUT. By synthesising an adaptable probe wave, an FUT with an arbitrary Brillouin frequency shift (BFS) distribution can be interrogated, though the strain dynamic range is limited by the size of the linear section of the BGS. Peled et al.6 have also demonstrated dynamic BOTDA measurements by implementing fast switching of the optical frequency using high-performance arbitrary-waveform generators, though the measurement time becomes longer as the length of the FUT increases. In addition, Taki et al.7 have proposed the use of cyclic pulse coding based on quasi-periodic bit sequences in BOTDA for an FUT with a length on the order of kilometres. This configuration allows for real-time decoding, resulting in a substantial reduction in the number of averages required to guarantee an acceptable SNR. Performance improvements for these distributed dynamic sensors based on BOTDA continue to be actively studied8, 9, 10, 11.
As for BOCDA, Song et al.19 have achieved a 1 kHz sampling rate at a single location using a time-division pump-probe generation scheme with optimised temporal gates and an unbalanced Mach–Zehnder delay line. By using differential frequency modulation, they have additionally demonstrated a distributed measurement over a 100-m-long FUT at a 20-Hz repetition rate with a spatial resolution of 80 cm20. Random accessibility with a 5-kHz sampling rate has also been demonstrated by the employment of a high-speed lock-in amplifier25. In this manner, only analysis systems based not on spontaneous but on stimulated Brillouin scattering (SBS) have been exploited to acquire Brillouin signals on a real-time basis with sub-metre spatial resolution.
In general, SBS is induced by a so-called pump-probe configuration, which poses one major problem: it requires the pump and probe lights to be injected into both ends of the FUT, which is inconvenient for practically embedding the sensors into materials and structures; furthermore, the measurement can no longer be performed when extremely high loss or breakage occurs at a point along the FUT. Some quasi-one-end-access configurations16, 21, 22 have been developed by exploiting the Fresnel reflection at the open end of an FUT (cut perpendicularly to the fibre axis for high reflection, and sometimes equipped with a mirror or a small fibre loop), but they cannot fully overcome the measurement difficulties associated with a breakage point.
Against this background, it is of paramount importance to develop distributed Brillouin sensing technology that simultaneously provides a high sampling/repetition rate, intrinsically one-end-access interrogation, and high spatial resolution. BOTDR does not seem to be suitable because the spontaneous-Brillouin-scattered signal generated by incident optical pulses is so weak that the signal needs to be integrated numerous times26, 27, 28, 29, 30.
In this work, we describe a new configuration for BOCDR that satisfies these three requirements simultaneously. BOCDR is known to be an intrinsically one-end-access sensing technique with high spatial resolution, but its highest sampling rate reported so far was lower than 20 Hz. Here, using a voltage-controlled oscillator (VCO), the frequency sweeping for acquiring the BGS is performed at high speed without using the inherent sweeping function of an electrical spectrum analyser (ESA), which is used only to detect the signal power at a fixed frequency component. The BGS is then approximated by a sinusoidal waveform, and the BFS is converted into its phase delay, which is further converted into a voltage for direct measurement. When a single-point measurement is performed at an arbitrary position, a maximal strain sampling rate of 100 kHz is experimentally verified by detecting dynamic strain at 1 kHz. When distributed measurements are performed at 100 points with 10 times averaging, a repetition rate of 100 Hz (corresponding to the maximal sampling rate (100 kHz) divided by the number of points and the number of averages) is then shown to be feasible by tracking a propagating mechanical wave. This ultrahigh-speed configuration comes at the cost of deteriorated measurement accuracy, lowered spatial resolution and a limited range of measurable strain, all of which are discussed in detail.
Materials and methods
BOCDR31, 32 is known as a distributed sensing technique with intrinsic one-end accessibility and high spatial resolution. Its operating principle is based on the correlation control of continuous lightwaves in a self-heterodyne scheme. In other words, the pump light and the reference light (instead of the probe light used in BOCDA) are sinusoidally frequency-modulated at fm, producing periodical ‘correlation peaks’38 in the FUT. The measurement range dm, determined by the interval between the correlation peaks, is inversely proportional to fm, as follows32:

dm = c / (2·n·fm),   (1)
where c is the velocity of light in a vacuum and n is the refractive index of the fibre core. By sweeping fm, the correlation peak, i.e. the sensing position, can be scanned along the fibre to acquire a BGS or BFS distribution. The spatial resolution Δz is given by32:

Δz = (c·ΔνB) / (2π·n·fm·Δf),   (2)
where ΔνB is the Brillouin bandwidth (~30 MHz in silica single-mode fibres (SMFs)) and Δf is the modulation amplitude of the optical frequency. Note that Δf is practically limited to one half of the BFS of the FUT because of the noise caused by the Rayleigh scattering31, 32. To date, the highest sampling rate including data acquisition to the computer was 19 Hz33, and was limited by the sweeping speed of an ESA.
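As a quick numerical check, the two relations above, Equations (1) and (2), can be evaluated for the parameters used later in the experiment. This is an illustrative sketch: the refractive-index value n = 1.46 is an assumed typical figure for silica fibre, not a value reported in this work.

```python
import math

C = 2.998e8  # speed of light in vacuum [m/s]
N = 1.46     # refractive index of silica fibre core (assumed typical value)

def measurement_range(f_m):
    """Correlation-peak interval d_m [m] for modulation frequency f_m [Hz]."""
    return C / (2 * N * f_m)

def spatial_resolution(f_m, df, dnu_b=30e6):
    """Nominal spatial resolution [m]; dnu_b is the Brillouin bandwidth [Hz]."""
    return (C * dnu_b) / (2 * math.pi * N * f_m * df)

# Parameters from the operation-confirmation experiment:
d_m = measurement_range(7.222e6)           # ~14.2 m
dz = spatial_resolution(7.222e6, 0.35e9)   # ~0.39 m
print(round(d_m, 1), round(dz, 2))
```

Note how Δz shrinks as Δf grows, which is why the range-to-resolution ratio quoted later scales with the modulation amplitude.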
The experimental setup of the newly configured BOCDR is schematically shown in Figure 1. The fibre-optic parts were similar to those in a conventional setup35; a polarisation scrambler was employed in the pump path (see Supplementary Information for details). The heterodyned Brillouin signal was converted into an electrical signal using a photo detector (PD). The key to high-speed BGS acquisition is the conversion of the BGS from the frequency domain to the time domain (Figure 2a). The PD output was mixed with the output of a VCO, the frequency of which was repeatedly swept over a range of several hundred megahertz using a function generator, thereby scanning the BGS in the frequency domain. The power at a carefully chosen fixed frequency component of the mixed signal was then output from the ESA using a so-called zero-span mode, which repeatedly provided the BGS in the time domain. The BGS acquisition (whose speed is determined by the repetition rate of the VCO) is much faster than the calculation required to derive the BFS from each BGS, and therefore we developed a method for deriving the BFS simultaneously with the BGS acquisition.

The basic concept of the subsequent signal processing is shown in Figure 2b. The ESA output, which can be regarded as the BGS in the time domain, was approximated by a one-period sinusoidal waveform using a band-pass filter (BPF) with the same central frequency as the repetition rate of the VCO. At this stage, the BFS was in one-to-one correspondence with the phase delay of the sinusoidal waveform (at the expense of a limited range of measurable strain; only a phase delay smaller than 180° can be properly detected). The phase delay was then converted into a quasi-DC voltage using an electrical amplifier (AMP), an exclusive-OR (XOR) logic gate and a low-pass filter (LPF), and was finally input to the computer via a sound board and monitored using a virtual oscilloscope triggered by the repetition frequency for the distributed measurement.
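As a minimal numerical sketch of this final stage (not the authors' implementation; the waveform length and logic level are illustrative), XOR-ing a reference square wave with its phase-delayed copy yields a pulse train whose low-pass-filtered mean grows linearly with the delay up to 180°, which is exactly why the measurable range of phase delay, and hence of strain, is limited:

```python
import numpy as np

def xor_phase_to_dc(phase_deg, n_samples=3600, v_high=5.0):
    """Mean XOR output (i.e. the LPF output) for a given phase delay [deg]."""
    t = np.arange(n_samples)
    ref = t < n_samples // 2                                    # reference square wave
    delayed = np.roll(ref, int(n_samples * phase_deg / 360.0))  # phase-delayed copy
    return v_high * np.mean(ref ^ delayed)                      # LPF ~= time average

print(xor_phase_to_dc(0.0), xor_phase_to_dc(90.0), xor_phase_to_dc(180.0))
# 0.0 2.5 5.0 -- delays beyond 180 degrees fold back, limiting the dynamic range
```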
Results and discussion
Operation confirmation
First, the VCO operation with a BFS sampling rate of 100 kHz was confirmed (an even higher sampling rate was achievable if some SNR deterioration was accepted). When the frequency-control voltage was linearly applied to the VCO (blue line in Figure 3a), the output frequency was distorted (blue curve in Figure 3b) because of the nonlinear dependence of the frequency on the applied voltage. Therefore, we applied a pre-distorted voltage (red curve in Figure 3a) and obtained a linear output frequency (red line in Figure 3b). The output frequency was swept from 2.17 to 2.53 GHz, which was set to be sufficiently lower than the BFS (~10.8 GHz) of a silica SMF. If this frequency were close to the BFS, the differential component of their mixed signal would be located near DC and overlapped by its own folded spectrum, resulting in a significantly deteriorated SNR.
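The pre-distortion step can be sketched in software: given a calibration of the nonlinear tuning curve (the square-root shape below is purely hypothetical), the control voltage that linearises the swept frequency is obtained by inverting that curve, e.g. with monotonic interpolation:

```python
import numpy as np

# Hypothetical VCO tuning curve: frequency rises sub-linearly with voltage.
v_cal = np.linspace(0.0, 5.0, 51)               # calibration voltages [V]
f_cal = 2.17e9 + 0.36e9 * np.sqrt(v_cal / 5.0)  # 2.17 -> 2.53 GHz, nonlinear

def predistorted_voltage(f_target):
    """Control voltage needed to reach f_target [Hz] (f_cal is monotonic)."""
    return np.interp(f_target, f_cal, v_cal)

# A frequency ramp that is linear in time requires this pre-distorted voltage:
f_ramp = np.linspace(2.17e9, 2.53e9, 101)
v_ramp = predistorted_voltage(f_ramp)

# Feeding v_ramp back through the tuning curve reproduces the linear ramp.
f_out = np.interp(v_ramp, v_cal, f_cal)
print(float(np.max(np.abs(f_out - f_ramp))) < 1e3)   # True: residual is tiny
```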
Next, we verified that the phase delay and the final DC voltage were in one-to-one correspondence with the applied strain. A 12.8-m-long silica SMF (composed of a 0.8-m-long pigtail of an optical circulator connected to a 12-m-long SMF using an FC/APC adaptor) was used as the FUT. Strains of up to 0.30% were applied to a 0.4-m-long section (6.6–7.0 m away from the circulator) (Figure 4). The open end was cut at an angle of 8° in order to suppress the Fresnel reflection. The modulation frequency fm was set to 7.222 MHz, with a correlation peak located at the midpoint of the strained section. The measurement range dm was calculated to be 14.2 m according to Equation (1). The modulation amplitude Δf was set to 0.35 GHz, resulting in a theoretical spatial resolution Δz of ~0.38 m from Equation (2). The ratio of the measurement range to the spatial resolution was ~37, which can be extended to ~570 simply by increasing Δf to half of the BFS, i.e. ~5.4 GHz (the Δf values were kept relatively low to avoid damage to the laser, which was not designed for large-amplitude modulation use). The 64th correlation peak was used, and the room temperature was 18 °C.
The raw and sinusoidally approximated BGSs with and without 0.07% strain are shown in Figure 5. The vertical axis was normalised so that the maximal and minimal powers of each spectrum became 1 and −1, respectively. Averaging was performed 10 times. The peaks of the raw BGSs, i.e. the BFSs, did not completely coincide with those of the approximated BGSs. This is natural considering that the approximation was performed using the whole shape of the BGS, and the strain can still be correctly detected so long as the actual BFS and the peak of the approximated BGS are in one-to-one correspondence (verified in the following paragraph). The raw and sinusoidally approximated BGSs with no averaging and with 100 times averaging are also shown in Supplementary Figs. S1a and S1b, respectively. As the number of averages increased, the SNR improved, which indicates that a 100 kHz sampling rate can be obtained if a low SNR is accepted.
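The band-pass filtering can be mimicked numerically: keeping only the fundamental Fourier component of the swept trace gives the one-period sinusoidal approximation, and a shift of the spectral peak (i.e. of the BFS) appears directly as a phase delay of that component. The Lorentzian trace and its parameters below are purely illustrative:

```python
import numpy as np

n = 1000
sweep = np.linspace(0.0, 1.0, n, endpoint=False)   # normalised sweep time

def bgs(peak_pos, width=0.05):
    """Illustrative Lorentzian gain spectrum; peak_pos encodes the BFS."""
    return 1.0 / (1.0 + ((sweep - peak_pos) / width) ** 2)

def fundamental_phase(trace):
    """Phase [deg] of the one-period sinusoidal approximation of the trace."""
    spectrum = np.fft.rfft(trace - np.mean(trace))
    return np.degrees(np.angle(spectrum[1]))       # bin 1 = one period per sweep

# Shifting the BGS peak by 5% of the sweep delays the phase by roughly 18 degrees.
p0 = fundamental_phase(bgs(0.40))
p1 = fundamental_phase(bgs(0.45))
print(round(p0 - p1))
```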
The strain dependence of the normalised BGS approximated by a one-period sinusoidal waveform is shown in Figure 6. Averaging was performed 10 times. The phase was delayed with increasing strain, but the dependence was not completely linear. The phase-delay dependence on strain of up to 0.20% (Figure 7a) was almost linear, with a slope of 952 degrees/% (calculated excluding the data at >0.20%, which deviated from the linear trend). To evaluate the measurement accuracy, we display error bars calculated at each strain as the standard deviations of 100 data points. The possible error was not constant across strains and was largest (+/−15.4°) at ~0.15% strain because of the spectral noise structure unique to the ESA. In the worst case, the measurement error was approximately +/−10%. We admit that this value is not sufficiently low for accurate strain measurements, but the system is still useful for relatively coarse dynamic strain measurements.
The final DC voltage, i.e. the output from the LPF (cut-off frequency = 1.2 kHz), was also plotted as a function of applied strain (Figure 7b). The voltage increased linearly with increasing strain (slope = 3.2 V/%) and reached a maximum at ~0.2%, where the phase delay became 180°; beyond this point, the voltage decreased. The DC voltage was thus confirmed to correspond to the applied strain, but with a limited strain dynamic range.
Finally, we investigated the error of our phase-based detection, which can be induced by modification of the BGS due to an irregular strain distribution in the vicinity of the correlation peak. In general, the BGS in correlation-domain techniques consists of two spectral contributions: a sharp signal peak corresponding to the strain at the correlation peak, and the sum of the noise substructures originating from all other positions23. Therefore, the strain distribution at positions near the correlation peak may modify the overall shape of the BGS, although the peak frequency is maintained. Figure 8 shows an extreme case in which large strains are applied near the correlation peak (refer to Supplementary Information for details); significant modification of the BGS is evident. To quantify the possible error of the phase-based detection caused by this ‘neighbourhood effect’, we performed simulations using the method detailed in Ref. 24 with various strain distributions near the correlation peak. As detailed in the Supplementary Information, we conclude that if an absolute error of +/−200 μɛ (or +/−10 MHz) is allowed, the spatial resolution is effectively degraded by a factor of ~3. It should also be noted that the drop in performance depends on measurement parameters such as the frequency sweep range, FUT length and maximal applied strain. This finding serves as a general guideline, useful not only for this method but also for other spectral-shape-based methods such as slope-assisted BOCDR37.
Demonstrations
The first demonstration was dynamic strain measurement. A 0.10% static strain was applied in advance to a 0.4-m-long section (6.6–7.0 m) of the 12.8-m-long silica SMF (Figure 4); the static strain was applied because it was difficult to stably apply dynamic strains to a non-strained section. Dynamic strains at 30 Hz, 100 Hz, 300 Hz and 1 kHz were then applied to the same section using a vibration generator, making it unnecessary to consider the neighbourhood effect discussed in the previous section. Averaging was performed 10 times. The other experimental conditions were the same as those for the operation confirmation described above. The measured temporal variations (deviation from the initial value) of the output voltage (Figure 9a–9d) indicate that the dynamic strains of up to 1 kHz were successfully detected, though the data at 1 kHz were somewhat distorted. The vibration amplitude, which depends on the vibration frequency, can be derived from Figure 7b; for instance, at 30 Hz, the applied strain ranges from 0.02% to 0.18%, corresponding to a vibration amplitude of 0.32 mm. This value was moderately consistent with the value directly measured by laser Doppler velocimetry.
Subsequently, to demonstrate the system’s capability for ultrahigh-speed acquisition of the strain distribution, we attempted to track the propagation of a mechanical wave39 along the FUT. The structure of the FUT is depicted in Figure 9e, where a 3.2-m-long section (6.0–9.2 m) was tightly adhered to a 0.1-m-wide, 1-mm-thick rubber sheet using tape. Note that the newly employed 4.8-m-long SMF (0.8–5.6 m) had a slightly lower BFS (~10.8 GHz) than that of the 12-m-long SMF (5.6–17.6 m). A mechanical wave was manually generated (Figure 9f) and propagated along the rubber-adhered section at a relatively slow speed. The sampling rate of the BFS (or strain) was set to 100 kHz, and the repetition rate of the distributed measurement was set to 100 Hz. Averaging was performed 10 times. The measurement range dm was ~20.4 m (fm swept from 5.048 to 5.153 MHz; 24th correlation peak) and the nominal spatial resolution Δz was ~0.39 m (Δf=0.5 GHz; the neighbourhood effect was not considered).
The measured temporal variation of the strain distribution around the 3.2-m-long section attached to the rubber sheet (Figure 9g) shows that the mechanical wave propagation was detected, though the SNR was low. The propagation speed was calculated to be ~10 m s−1. The length of the strained section and the amplitude of the strain decreased as the mechanical wave propagated, which is consistent with the actual observation (Figure 9f). The amount of strain caused by the wave propagation (<0.03%) seems to be valid, as it is close to that measured by two-end-access BOTDA39. The amount of strain along the 4.8-m-long SMF with a lower BFS was not correctly detected (negative values were obtained), which is reasonable considering the operating principle that limits the measurable strain range. The negative-strain (i.e. compressed) regions, which should appear at the foot of the peak39, were not clearly detected, probably because of a combination of three factors: (i) the dependence coefficient of the phase delay on negative strain (compression) was smaller, as the experimental setup was optimised for strain ranging from 0 to ~0.2%; (ii) the low-pass filtering at the end of the signal processing reduced the drop of the voltage; (iii) the properties of the materials (rubber, tape and so on) used to fabricate the FUT differed from those in Ref. 39. In this way, although the SNR was not sufficiently high, the ultrahigh-speed distributed strain sensing capability of this system was demonstrated.
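The speed estimate can be illustrated with a toy reconstruction (all values are hypothetical, chosen only to mirror the experimental conditions): locate the strain peak in each distributed frame, taken 10 ms apart at the 100 Hz repetition rate, and fit position against time:

```python
import numpy as np

rep_rate = 100.0                 # repetition rate of distributed measurement [Hz]
z = np.linspace(6.0, 9.2, 100)   # sensing positions along the fibre [m]
times = np.arange(20) / rep_rate # 20 frames, 10 ms apart

# Hypothetical frames: a 0.03% strain pulse propagating at 10 m/s.
frames = [0.03 * np.exp(-((z - (6.5 + 10.0 * t)) / 0.3) ** 2) for t in times]

peaks = np.array([z[np.argmax(f)] for f in frames])  # peak position per frame
speed = np.polyfit(times, peaks, 1)[0]               # slope of position vs time
print(round(speed, 1))   # ~10 m/s
```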
Conclusions
In this work, real-time distributed Brillouin reflectometry with intrinsic one-end accessibility and high spatial resolution was developed for the first time by use of a correlation-domain technique. In this configuration, the BGS was converted from the frequency domain to the time domain by mixing it with the output of a VCO. The BGS in the time domain was then approximated by a one-period sinusoidal waveform, and the BFS was converted into its phase delay, which was subsequently converted into a voltage so that the BFS information could be obtained directly. A strain sampling rate of up to 100 kHz at an arbitrary position was experimentally verified by detecting locally applied dynamic strain at 1 kHz. A repetition rate of distributed measurement of 100 Hz (10 times averaging, 100 sensing points) was also verified by tracking a propagating mechanical wave. We must admit that, compared with other high-speed techniques, including slope-assisted BOTDA5, 6 and BOCDA19, 20, 25, the current performance of our method is low because of its major drawbacks: relatively low measurement accuracy (approximately +/−10% error in the worst case, with 10 times averaging), spatial resolution degraded by a factor of ~3 (compared with standard BOCDR31, 32 or BOCDA17) and a limited strain dynamic range (0 to ~0.2% in this experiment); these points leave room for future improvement. However, we anticipate that intrinsic one-end accessibility, an advantage that other techniques cannot provide, will more than compensate for these shortcomings and that this will be a promising technique for distributed dynamic strain and temperature sensing with practical convenience, especially over relatively short measurement ranges.
Author contributions
YM and KN designed the study. YM and NH performed the experiments. YM and HF analysed the data. YM and KYS performed the simulation. YM wrote the manuscript with input from all co-authors.
References
1. Agrawal GP. Nonlinear Fibre Optics. New York: Academic Press; 2001.
2. Horiguchi T, Tateda M. BOTDA—nondestructive measurement of single-mode optical fibre attenuation characteristics using Brillouin interaction: theory. J Lightwave Technol 1989; 7: 1170–1176.
3. Voskoboinik A, Wang J, Shamee B, Nuccio SR, Zhang L et al. SBS-based fibre optical sensing using frequency-domain simultaneous tone interrogation. J Lightwave Technol 2011; 29: 1729–1735.
4. Bernini R, Minardo A, Zeni L. Dynamic strain measurement in optical fibres by stimulated Brillouin scattering. Opt Lett 2009; 34: 2613–2615.
5. Peled Y, Motil A, Yaron L, Tur M. Slope-assisted fast distributed sensing in optical fibres with arbitrary Brillouin profile. Opt Express 2011; 19: 19845–19854.
6. Peled Y, Motil A, Tur M. Fast Brillouin optical time domain analysis for dynamic sensing. Opt Express 2012; 20: 8584–8591.
7. Taki M, Muanenda Y, Oton CJ, Nannipieri T, Signorini A et al. Cyclic pulse coding for fast BOTDA fibre sensors. Opt Lett 2013; 38: 2877–2880.
8. Danon O, Motil A, Sovran I, Hadar R, Tur M. Real-time fast and distributed measurement of a Brillouin-inhomogeneous fibre using tailored-frequency probe in slope-assisted BOTDA. Proc SPIE 2014; 9157: 9157AM.
9. Muanenda Y, Taki M, Pasquale FD. Long-range accelerated BOTDA sensor using adaptive linear prediction and cyclic coding. Opt Lett 2014; 39: 5411–5414.
10. Sovran I, Motil A, Tur M. Frequency-scanning BOTDA with ultimately fast acquisition speed. IEEE Photonics Technol Lett 2015; 27: 1426–1429.
11. Dong YK, Ba DX, Jiang TF, Zhou DW, Zhang HY et al. High-spatial-resolution fast BOTDA for dynamic strain measurement based on differential double-pulse and second-order sideband of modulation. IEEE Photonics J 2013; 5: 2600407.
12. Elooz D, Antman Y, Levanon N, Zadok A. High-resolution long-reach distributed Brillouin sensing based on combined time-domain and correlation-domain analysis. Opt Express 2014; 22: 6453–6463.
13. Garus D, Krebber K, Schliep F, Gogolla T. Distributed sensing technique based on Brillouin optical-fibre frequency-domain analysis. Opt Lett 1996; 21: 1402–1404.
14. Bernini R, Minardo A, Zeni L. Distributed sensing at centimeter-scale spatial resolution by BOFDA: measurements and signal processing. IEEE Photonics J 2012; 4: 48–56.
15. Minardo A, Bernini R, Zeni L. Distributed temperature sensing in polymer optical fibre by BOFDA. IEEE Photonics Technol Lett 2014; 26: 387–390.
16. Wosniok A, Mizuno Y, Krebber K, Nakamura K. L-BOFDA: a new sensor technique for distributed Brillouin sensing. Proc SPIE 2013; 8794: 879431.
17. Hotate K, Hasegawa T. Measurement of Brillouin gain spectrum distribution along an optical fibre using a correlation-based technique—proposal, experiment and simulation. IEICE Trans Electron 2000; E83-C: 405–412.
18. Song KY, He ZY, Hotate K. Distributed strain measurement with millimeter-order spatial resolution based on Brillouin optical correlation domain analysis. Opt Lett 2006; 31: 2526–2528.
19. Song KY, Hotate K. Distributed fibre strain sensor with 1-kHz sampling rate based on Brillouin optical correlation domain analysis. IEEE Photonics Technol Lett 2007; 19: 1928–1930.
20. Song KY, Kishi M, He ZY, Hotate K. High-repetition-rate distributed Brillouin sensor based on optical correlation-domain analysis with differential frequency modulation. Opt Lett 2011; 36: 2062–2064.
21. Song KY, Hotate K. Brillouin optical correlation domain analysis in linear configuration. IEEE Photonics Technol Lett 2008; 20: 2150–2152.
22. Jeong JH, Chung KH, Lee SB, Song KY, Jeong JM et al. Linearly configured BOCDA system using a differential measurement scheme. Opt Express 2014; 22: 1467–1473.
23. Song KY, He ZY, Hotate K. Optimization of Brillouin optical correlation domain analysis system based on intensity modulation scheme. Opt Express 2006; 14: 4256–4263.
24. Song KY, He ZY, Hotate K. Effects of intensity modulation of light source on Brillouin optical correlation domain analysis. J Lightwave Technol 2007; 25: 1238–1246.
25. Zhang CY, Kishi M, Hotate K. 5,000 points/s high-speed random accessibility for dynamic strain measurement at arbitrary multiple points along a fibre by Brillouin optical correlation domain analysis. Appl Phys Express 2015; 8: 042501.
26. Kurashima T, Horiguchi T, Izumita H, Furukawa S, Koyamada Y. Brillouin optical-fibre time domain reflectometry. IEICE Trans Commun 1993; E76-B: 382–390.
27. Alahbabi MN, Cho YT, Newson TP. 100 km distributed temperature sensor based on coherent detection of spontaneous Brillouin backscatter. Meas Sci Technol 2004; 15: 1544–1547.
28. Geng J, Staines S, Blake M, Jiang SB. Distributed fibre temperature and strain sensor using coherent radio-frequency detection of spontaneous Brillouin scattering. Appl Opt 2007; 46: 5928–5932.
29. Masoudi A, Belal M, Newson TP. Distributed dynamic large strain optical fibre sensor based on the detection of spontaneous Brillouin scattering. Opt Lett 2013; 38: 3312–3315.
30. Tu GJ, Zhang XP, Zhang YX, Ying ZF, Lv LD. Strain variation measurement with short-time Fourier transform-based Brillouin optical time-domain reflectometry sensing system. Electron Lett 2014; 50: 1624–1626.
31. Mizuno Y, Zou WW, He ZY, Hotate K. Proposal of Brillouin optical correlation-domain reflectometry (BOCDR). Opt Express 2008; 16: 12148–12153.
32. Mizuno Y, Zou WW, He ZY, Hotate K. Operation of Brillouin optical correlation-domain reflectometry: theoretical analysis and experimental validation. J Lightwave Technol 2010; 28: 3300–3306.
33. Mizuno Y, He ZY, Hotate K. One-end-access high-speed distributed strain measurement with 13-mm spatial resolution based on Brillouin optical correlation-domain reflectometry. IEEE Photonics Technol Lett 2009; 21: 474–476.
34. Mizuno Y, He ZY, Hotate K. Measurement range enlargement in Brillouin optical correlation-domain reflectometry based on temporal gating scheme. Opt Express 2009; 17: 9040–9046.
35. Mizuno Y, He ZY, Hotate K. Stable entire-length measurement of fibre strain distribution by Brillouin optical correlation-domain reflectometry with polarization scrambling and noise-floor compensation. Appl Phys Express 2009; 2: 062403.
36. Manotham S, Kishi M, He ZY, Hotate K. 1-cm spatial resolution with large dynamic range in strain distributed sensing by Brillouin optical correlation domain reflectometry based on intensity modulation. Proc SPIE 2012; 8351: 835136.
37. Lee HY, Hayashi N, Mizuno Y, Nakamura K. Slope-assisted Brillouin optical correlation-domain reflectometry: proof of concept. IEEE Photonics J 2016; 8: 6802807.
38. Hotate K, He ZY. Synthesis of optical-coherence function and its applications in distributed and multiplexed optical sensing. J Lightwave Technol 2006; 24: 2541–2557.
39. Peled Y, Motil A, Kressel I, Tur M. Monitoring the propagation of mechanical waves using an optical fibre distributed and dynamic strain sensor based on BOTDA. Opt Express 2013; 21: 10697–10705.
Acknowledgements
We wish to acknowledge Tomohito Kawa, Heeyoung Lee, Shumpei Shimada, Makoto Shizuka, Kazunari Minakawa, Hiroki Tanaka, Wei Qiu, Sho Ikeda, Daisuke Yamane, Hiroyuki Ito, Shiro Dosho and Kazuya Masu (Institute of Innovative Research, Tokyo Institute of Technology) for their experimental assistance and Richard Nedelcov (Department of Language Arts, Tokyo University of the Arts) for his English editing. This work was supported by JSPS KAKENHI Grant Numbers 25709032, 26630180 and 25007652, and by research grants from the Iwatani Naoji Foundation, the SCAT Foundation and the Konica Minolta Science and Technology Foundation.
Author information
Affiliations
Institute of Innovative Research, Tokyo Institute of Technology, Midori-ku, Yokohama 226-8503, Japan
Yosuke Mizuno
& Kentaro Nakamura
Research Center for Advanced Science and Technology, The University of Tokyo, Meguro-ku, Tokyo 153-8904, Japan
Neisei Hayashi
Servo Laboratory, FANUC Corporation, Oshino-mura, Yamanashi 401-0597, Japan
Hideyuki Fukuda
Department of Physics, Chung-Ang University, Dongjak-gu, Seoul 06974, Korea
Kwang Yong Song
Competing interests
The authors declare no conflict of interest.
Corresponding author
Correspondence to Yosuke Mizuno.
Supplementary information
Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/
To obtain permission to re-use content from this article visit RightsLink.
About this article
Note: Supplementary Information for this article can be found on the Light: Science & Applications’ website(http://www.nature.com/lsa).
Further reading
Potential of Discriminative Sensing of Strain and Temperature Using Perfluorinated Polymer FBG
IEEE Sensors Journal (2019)
Dynamic Strain Measurements Based on High-Speed Single-End-Access Brillouin Optical Correlation Domain Analysis
Infrared thermometry for breakage detection of optical fibers embedded in structures
Applied Physics Express (2019)
Distributed temperature sensing based on slope-assisted Brillouin optical correlation-domain reflectometry with over 10 km measurement range
Electronics Letters (2019)
Enhanced stability and sensitivity of slope-assisted Brillouin optical correlation-domain reflectometry using polarization-maintaining fibers
OSA Continuum (2019)
Fibre sensors: ultrafast sampling
An optical fibre sensing scheme that measures strain with a high spatial resolution and a very high sampling rate has been developed. Optical fibre sensors based on Brillouin scattering are promising for monitoring structural health. The system built by Yosuke Mizuno of Tokyo Institute of Technology and colleagues measures the frequency shift induced in the fibre’s Brillouin gain spectrum on stretching the fibre. This frequency shift is converted into a phase delay of a sinusoidal waveform, which enables the direct detection of the frequency shift. The approach allows single-point strain measurements to be performed at a rate of up to 100 kilohertz at any point along the fibre. Distributed measurements at multiple points along the fibre are also possible, although at lower repetition rates. Importantly, the scheme only requires access from one end of the fibre.
Information theory
Information theory studies the quantification, storage, and communication of information. It was originally proposed by Claude Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper titled 'A Mathematical Theory of Communication'. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.
The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering. The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology,[1] human vision,[2] the evolution[3] and function[4] of molecular codes (bioinformatics), model selection in statistics,[5] thermal physics,[6] quantum computing, linguistics, plagiarism detection,[7] pattern recognition, and anomaly detection.[8] Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, grey system theory and measures of information.
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Information theory is used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.
A key measure in information theory is 'entropy'. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
Overview[edit]
Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication of information over a noisy channel, this abstract concept was made concrete in 1948 by Claude Shannon in his paper 'A Mathematical Theory of Communication', in which 'information' is thought of as a set of possible messages, where the goal is to send these messages over a noisy channel, and then to have the receiver reconstruct the message with low probability of error, in spite of the channel noise. Shannon's main result, the noisy-channel coding theorem showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.[1]
Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.
Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible.
A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (unit) for a historical application.
Historical background[edit]
The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper 'A Mathematical Theory of Communication' in the Bell System Technical Journal in July and October 1948.
Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying 'intelligence' and the 'line speed' at which it can be transmitted by a communication system, giving the relation W = K log m (recalling Boltzmann's constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.
Much of the mathematics behind information theory with events of different probabilities were developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.
In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that
'The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point.'
With it came the ideas of
the information entropy and redundancy of a source, and its relevance through the source coding theorem;
the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; as well as
the bit—a new way of seeing the most fundamental unit of information.
Quantities of information[edit]
Information theory is based on probability theory and statistics. Information theory often concerns itself with measures of information of the distributions associated with random variables. Important quantities of information are entropy, a measure of information in a single random variable, and mutual information, a measure of information in common between two random variables. The former quantity is a property of the probability distribution of a random variable and gives a limit on the rate at which data generated by independent samples with the given distribution can be reliably compressed. The latter is a property of the joint distribution of two random variables, and is the maximum rate of reliable communication across a noisy channel in the limit of long block lengths, when the channel statistics are determined by the joint distribution.
The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. A common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the decimal digit, which is based on the common logarithm.
In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim p→0+ p log p = 0 for any logarithmic base.
Entropy of an information source[edit]
Based on the probability mass function of each source symbol to be communicated, the Shannon entropy H, in units of bits (per symbol), is given by

H = − Σi pi log2(pi)

where pi is the probability of occurrence of the i-th possible value of the source symbol. This equation gives the entropy in the units of 'bits' (per symbol) because it uses a logarithm of base 2, and this base-2 measure of entropy has sometimes been called the 'shannon' in his honor. Entropy is also commonly computed using the natural logarithm (base e, where e is Euler's number), which produces a measurement of entropy in 'nats' per symbol and sometimes simplifies the analysis by avoiding the need to include extra constants in the formulas. Other bases are also possible, but less commonly used. For example, a logarithm of base 2^8 = 256 will produce a measurement in bytes per symbol, and a logarithm of base 10 will produce a measurement in decimal digits (or hartleys) per symbol.
Intuitively, the entropy H(X) of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X when only its distribution is known.
The entropy of a source that emits a sequence of N symbols that are independent and identically distributed (iid) is N ⋅ H bits (per message of N symbols). If the source data symbols are identically distributed but not independent, the entropy of a message of length N will be less than N ⋅ H.
The entropy of a Bernoulli trial as a function of success probability, often called the binary entropy function, Hb(p). The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.
If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {x1, …, xn} that X could be, and p(x) is the probability of some x ∈ 𝕏, then the entropy, H, of X is defined:[9]

H(X) = EX[I(x)] = − Σx∈𝕏 p(x) log p(x)

(Here, I(x) = − log p(x) is the self-information, which is the entropy contribution of an individual message, and EX is the expected value.) A property of entropy is that it is maximized when all the messages in the message space are equiprobable p(x) = 1/n; i.e., most unpredictable, in which case H(X) = log n.
The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2, thus having the shannon (Sh) as unit:

Hb(p) = − p log2 p − (1 − p) log2(1 − p)
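These definitions are simple to compute directly. The following is a minimal Python sketch (the helper names `entropy` and `hb` are ours, not from any standard library):

```python
from math import e, log

def entropy(probs, base=2):
    # Shannon entropy of a discrete distribution; terms with p = 0
    # contribute nothing, per the p log p = 0 convention.
    return -sum(p * log(p, base) for p in probs if p > 0)

def hb(p):
    # Binary entropy function Hb(p) in shannons (bits).
    return entropy([p, 1 - p])

coin = entropy([0.5, 0.5])      # fair coin: 1 bit per toss
die = entropy([1 / 6] * 6)      # fair die: log2(6) ~ 2.585 bits per roll
nats = entropy([0.5, 0.5], e)   # same coin in nats: ln(2) ~ 0.693
```

As expected, `hb(p)` peaks at 1 bit when p = 0.5 and falls to 0 at p = 0 or p = 1.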
Joint entropy[edit]
The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing: (X, Y). This implies that if X and Y are independent, then their joint entropy is the sum of their individual entropies.
For example, if (X, Y) represents the position of a chess piece — X the row and Y the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.
Despite similar notation, joint entropy should not be confused with cross entropy.
Conditional entropy (equivocation)[edit]
The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:[10]

H(X|Y) = − Σx,y p(x, y) log p(x|y)

Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:

H(X|Y) = H(X, Y) − H(Y)
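As a concrete sketch, the chain rule can be checked numerically in plain Python (the joint distribution below is invented purely for illustration):

```python
from math import log2

# An illustrative joint distribution p(x, y) over two dependent bits.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    # Entropy in bits of a dict mapping outcomes to probabilities.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distribution of Y.
py = {}
for (x, y), p in pxy.items():
    py[y] = py.get(y, 0.0) + p

# Conditional entropy H(X|Y), averaged over Y, from p(x|y) = p(x,y)/p(y).
h_x_given_y = -sum(p * log2(p / py[y]) for (x, y), p in pxy.items() if p > 0)

h_chain = H(pxy) - H(py)   # chain rule: H(X|Y) = H(X, Y) - H(Y)
```

Both routes give the same value (about 0.722 bits here), and conditioning never increases entropy: for this table H(X|Y) is below the marginal entropy H(X) = 1 bit.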
Mutual information (transinformation)[edit]
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication, where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of X relative to Y is given by:

I(X; Y) = EX,Y[SI(x, y)] = Σx,y p(x, y) log [ p(x, y) / (p(x) p(y)) ]
where SI (Specific mutual Information) is the pointwise mutual information.
A basic property of the mutual information is that

I(X; Y) = H(X) − H(X|Y).
That is, knowing Y, we can save an average of I(X; Y) bits in encoding X compared to not knowing Y.
Mutual information is symmetric:

I(X; Y) = I(Y; X) = H(X) + H(Y) − H(X, Y).
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) between the posterior probability distribution of X given the value of Y and the prior distribution on X:

I(X; Y) = Ep(y)[ DKL( p(X|Y = y) ‖ p(X) ) ]
In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X; Y) = DKL( p(X, Y) ‖ p(X) p(Y) )
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ2 test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
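The defining sum is easy to evaluate for a small table. A minimal Python sketch (the joint distribution and helper names are illustrative, not from any library):

```python
from math import log2

pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def marginal(joint, axis):
    # Marginal distribution along one coordinate of a joint dict.
    m = {}
    for key, p in joint.items():
        m[key[axis]] = m.get(key[axis], 0.0) + p
    return m

px, py = marginal(pxy, 0), marginal(pxy, 1)

# I(X; Y) as the expectation of the pointwise mutual information.
mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items() if p > 0)

# Symmetry: computing I(Y; X) from the transposed table gives the same value.
pyx = {(y, x): p for (x, y), p in pxy.items()}
mi_t = sum(p * log2(p / (py[y] * px[x])) for (y, x), p in pyx.items() if p > 0)
```

For this table I(X; Y) ≈ 0.278 bits; it would be exactly 0 if the table factored as p(x)p(y).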
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a 'true' probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined

DKL( p(X) ‖ q(X) ) = Σx p(x) log ( p(x) / q(x) )
Although it is sometimes used as a 'distance metric', KL divergence is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).
Another interpretation of the KL divergence is the 'unnecessary surprise' introduced by a prior from the truth: suppose a number X is about to be drawn randomly from a discrete set with probability distribution p(x). If Alice knows the true distribution p(x), while Bob believes (has a prior) that the distribution is q(x), then Bob will be more surprised than Alice, on average, upon seeing the value of X. The KL divergence is the (objective) expected value of Bob's (subjective) surprisal minus Alice's surprisal, measured in bits if the log is in base 2. In this way, the extent to which Bob's prior is 'wrong' can be quantified in terms of how 'unnecessarily surprised' it is expected to make him.
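A short numerical sketch of this asymmetry in Python (`p` and `q` are invented toy distributions):

```python
from math import log2

def kl(p, q):
    # D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # the 'true' distribution Alice knows
q = [0.9, 0.1]   # Bob's mistaken prior

extra_bits = kl(p, q)   # ~0.737: average extra bits per symbol coding with q
reverse = kl(q, p)      # ~0.531: a different value, so KL is not symmetric
```

Note kl(p, p) is always 0: there is no penalty for coding with the correct distribution.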
Other quantities[edit]
Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.
Coding theory[edit]
A picture showing scratches on the readable surface of a CD-R. Music and data CDs are coded using error correcting codes and thus can still be read even if they have minor scratches using error detection and correction.
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
Data compression (source coding): There are two formulations for the compression problem:
lossless data compression: the data must be reconstructed exactly;
lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.
Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary 'helpers' (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.
Source theory[edit]
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.
Rate[edit]
Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is

r = lim n→∞ H(Xn | Xn−1, Xn−2, Xn−3, …);
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is

r = lim n→∞ (1/n) H(X1, X2, …, Xn);
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.[11]
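For a concrete case, here is a Python sketch of the entropy rate of a stationary two-state Markov source (the transition matrix is invented for illustration):

```python
from math import log2

# P[i][j]: probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Stationary distribution mu solving mu = mu P (closed form for two states).
mu0 = P[1][0] / (P[0][1] + P[1][0])
mu = [mu0, 1.0 - mu0]

def h_row(row):
    # Entropy in bits of one row of the transition matrix.
    return -sum(p * log2(p) for p in row if p > 0)

# Entropy rate: H(next symbol | current state), averaged over the stationary law.
rate = sum(mu[i] * h_row(P[i]) for i in range(2))   # ~0.557 bits/symbol
```

A memoryless source with the same marginal distribution would have a higher rate here; memory makes the next symbol more predictable and so lowers the entropy per symbol.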
It is common in information theory to speak of the 'rate' or 'entropy' of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
Channel capacity[edit]
Communications over a channel, such as an Ethernet cable, is the primary motivation of information theory. As anyone who has ever used a telephone (mobile or landline) knows, however, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality.
Consider the communications process over a discrete channel. A simple model of the process is: a transmitter sends a message X, the channel corrupts it according to p(y|x), and a receiver observes Y.
Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:

C = max_f I(X; Y)
This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.
Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
Capacity of particular channel models[edit]
A continuous-time analog communications channel subject to Gaussian noise — see Shannon–Hartley theorem.
A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of 1 − Hb(p) bits per channel use, where Hb is the binary entropy function to the base-2 logarithm:

Hb(p) = − p log2 p − (1 − p) log2(1 − p)
A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 − p bits per channel use.
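Both closed-form capacities are one-liners to check in Python (the function names are ours, not from any library):

```python
from math import log2

def hb(p):
    # Binary entropy function in bits.
    return -sum(q * log2(q) for q in (p, 1 - p) if q > 0)

def bsc_capacity(p):
    # Binary symmetric channel: C = 1 - Hb(p) bits per use.
    return 1 - hb(p)

def bec_capacity(p):
    # Binary erasure channel: C = 1 - p bits per use.
    return 1 - p

c_clean = bsc_capacity(0.0)    # noiseless: a full bit per use
c_coin = bsc_capacity(0.5)     # output independent of input: 0 bits
c_bec = bec_capacity(0.1)      # 0.9 bits per use
```

Note the BSC capacity is symmetric in p: a channel that flips bits with probability 0.9 is as good as one that flips with probability 0.1, since the receiver can simply relabel the outputs.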
Applications to other fields[edit]
Intelligence uses and secrecy applications[edit]
Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of World War II in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.
Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time.
Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
Pseudorandom number generation[edit]
Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptography uses.
Seismic exploration[edit]
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.[12]
Semiotics[edit]
Semioticians Doede Nauta and Winfried Nöth both considered Charles Sanders Peirce as having created a theory of information in his works on semiotics.[13]:171[14]:137 Nauta defined semiotic information theory as the study of 'the internal processes of coding, filtering, and information processing.'[13]:91
Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.[15]
Miscellaneous applications[edit]
Information theory also has applications in gambling and investing, black holes, and bioinformatics.
See also[edit]
Constructor theory - a generalization of information theory that includes quantum information
Decoder
Pointwise mutual information (PMI)
References[edit]
^ ab F. Rieke; D. Warland; R. de Ruyter van Steveninck; W. Bialek (1997). Spikes: Exploring the Neural Code. The MIT Press. ISBN 978-0262681087.
^ Delgado-Bonal, Alfonso; Martín-Torres, Javier (2016-11-03). 'Human vision is determined based on information theory'. Scientific Reports. 6 (1). Bibcode:2016NatSR..636038D. doi:10.1038/srep36038. ISSN 2045-2322. PMC 5093619.
^ cf. Huelsenbeck, J. P.; Ronquist, F.; Nielsen, R.; Bollback, J. P. (2001). 'Bayesian inference of phylogeny and its impact on evolutionary biology'. Science. 294 (5550): 2310–2314. Bibcode:2001Sci..294.2310H. doi:10.1126/science.1065889.
^ Allikmets, Rando; Wasserman, Wyeth W.; Hutchinson, Amy; Smallwood, Philip; Nathans, Jeremy; Rogan, Peter K.; Schneider, Thomas D.; Dean, Michael (1998). 'Organization of the ABCR gene: analysis of promoter and splice junction sequences'. Gene. 215 (1): 111–122. doi:10.1016/s0378-1119(98)00269-8.
^ Burnham, K. P. and Anderson, D. R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, Second Edition (Springer Science, New York). ISBN 978-0-387-95364-9.
^ Jaynes, E. T. (1957). 'Information Theory and Statistical Mechanics'. Phys. Rev. 106 (4): 620. Bibcode:1957PhRv.106.620J. doi:10.1103/physrev.106.620.
^ Bennett, Charles H.; Li, Ming; Ma, Bin (2003). 'Chain Letters and Evolutionary Histories'. Scientific American. 288 (6): 76–81. Bibcode:2003SciAm.288f.76B. doi:10.1038/scientificamerican0603-76. PMID 12764940.
^ David R. Anderson (November 1, 2003). 'Some background on why people in the empirical sciences may want to better understand the information-theoretic methods' (PDF). Archived from the original (PDF) on July 23, 2011. Retrieved 2010-06-23.
^ Fazlollah M. Reza (1994) [1961]. An Introduction to Information Theory. Dover Publications, Inc., New York. ISBN 0-486-68210-2.
^ Robert B. Ash (1990) [1965]. Information Theory. Dover Publications, Inc. ISBN 0-486-66521-6.
^ Jerry D. Gibson (1998). Digital Compression for Multimedia: Principles and Standards. Morgan Kaufmann. ISBN 1-55860-369-7.
^ ab Nauta, Doede (1972). The Meaning of Information. The Hague: Mouton. ISBN 9789027919960.
^ Nöth, Winfried (January 2012). 'Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge'. Cybernetics and Human Knowing. 19 (1–2): 137–161.
^ Nöth, Winfried (1981). 'Semiotics of ideology'. Semiotica, Issue 148.
The classic work[edit]
Shannon, C.E. (1948), 'A Mathematical Theory of Communication', Bell System Technical Journal, 27, pp. 379–423 & 623–656, July & October, 1948. PDF. Notes and other formats.
R.V.L. Hartley, 'Transmission of Information', Bell System Technical Journal, July 1928
Andrey Kolmogorov (1968), 'Three approaches to the quantitative definition of information' in International Journal of Computer Mathematics.
Other journal articles
J. L. Kelly, Jr., 'A New Interpretation of Information Rate', Bell System Technical Journal, Vol. 35, July 1956, pp. 917–26.
R. Landauer, 'Information is Physical', Proc. Workshop on Physics and Computation PhysComp'92 (IEEE Comp. Sci. Press, Los Alamitos, 1993) pp. 1–4.
R. Landauer, 'Irreversibility and Heat Generation in the Computing Process', IBM J. Res. Dev. Vol. 5, No. 3, 1961.
Timme, Nicholas; Alford, Wesley; Flecker, Benjamin; Beggs, John M. (2012). 'Multivariate information measures: an experimentalist's perspective'. arXiv:1111.6857 [cs.IT].
Textbooks on information theory
Arndt, C. Information Measures, Information and its Description in Science and Engineering (Springer Series: Signals and Communication Technology), 2004. ISBN 978-3-540-40855-0
Ash, RB. Information Theory. New York: Interscience, 1965. ISBN 0-470-03445-9. New York: Dover, 1990. ISBN 0-486-66521-6
Gallager, R. Information Theory and Reliable Communication. New York: John Wiley and Sons, 1968. ISBN 0-471-29048-3
Goldman, S. Information Theory. New York: Prentice Hall, 1953. New York: Dover, 1968. ISBN 0-486-62209-6; 2005 ISBN 0-486-44271-3
Cover, Thomas; Thomas, Joy A. (2006). Elements of information theory (2nd ed.). New York: Wiley-Interscience. ISBN 0-471-24195-4.
Csiszár, I.; Körner, J. Information Theory: Coding Theorems for Discrete Memoryless Systems. Akadémiai Kiadó, 2nd edition, 1997. ISBN 963-05-7440-3
MacKay, David J. C. Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1
Mansuripur, M. Introduction to Information Theory. New York: Prentice Hall, 1987. ISBN 0-13-484668-0
McEliece, R. The Theory of Information and Coding. Cambridge, 2002. ISBN 978-0521831857
Pierce, JR. An Introduction to Information Theory: Symbols, Signals and Noise. 1961; 2nd edition reprinted by Dover, 1980.
Reza, F. An Introduction to Information Theory. New York: McGraw-Hill, 1961. New York: Dover, 1994. ISBN 0-486-68210-2
Shannon, Claude; Weaver, Warren (1949). The Mathematical Theory of Communication (PDF). Urbana, Illinois: University of Illinois Press. ISBN 0-252-72548-4. LCCN 49-11922.
Stone, JV. Chapter 1 of Information Theory: A Tutorial Introduction, University of Sheffield, England, 2014. ISBN 978-0956372857.
Yeung, RW. A First Course in Information Theory. Kluwer Academic/Plenum Publishers, 2002. ISBN 0-306-46791-7.
Yeung, RW. Information Theory and Network Coding. Springer, 2008. ISBN 978-0-387-79233-0
Other books
Leon Brillouin, Science and Information Theory, Mineola, N.Y.: Dover, [1956, 1962] 2004. ISBN 0-486-43918-6
James Gleick, The Information: A History, a Theory, a Flood, New York: Pantheon, 2011. ISBN 978-0-375-42372-7
A. I. Khinchin, Mathematical Foundations of Information Theory, New York: Dover, 1957. ISBN 0-486-60434-9
H. S. Leff and A. F. Rex, Editors, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, Princeton, New Jersey (1990). ISBN 0-691-08727-X
Robert K. Logan, What is Information? – Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere, Toronto: DEMO Publishing.
Tom Siegfried, The Bit and the Pendulum, Wiley, 2000. ISBN 0-471-32174-5
Charles Seife, Decoding the Universe, Viking, 2006. ISBN 0-670-03441-X
Jeremy Campbell, Grammatical Man, Touchstone/Simon & Schuster, 1982. ISBN 0-671-44062-4
Henri Theil, Economics and Information Theory, Chicago: Rand McNally & Company, 1967.
Escolano, Suau, Bonev, Information Theory in Computer Vision and Pattern Recognition, Springer, 2009. ISBN 978-1-84882-296-2
Vlatko Vedral, Decoding Reality: The Universe as Quantum Information, Oxford University Press, 2010. ISBN 0-19-923769-7
MOOC on information theory
Raymond W. Yeung, 'Information Theory' (The Chinese University of Hong Kong)
External links
Hazewinkel, Michiel, ed. (2001) [1994], 'Information', Encyclopedia of Mathematics, Springer Science+Business Media B.V. / Kluwer Academic Publishers, ISBN 978-1-55608-010-4
Lambert F. L. (1999), 'Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms - Examples of Entropy Increase? Nonsense!', Journal of Chemical Education
IEEE Information Theory Society and ITSOC Monographs, Surveys, and Reviews