2011 Vol. 36, No. 12
The recent research on Wikipedia is first briefly reviewed, with emphasis on statistics of article quality in Wikipedia. Then automatic methods for evaluating article quality are discussed. These methods fall into two main kinds: correlation-based analysis and cooperation modeling. Furthermore, we present open problems of automatic quality evaluation and possible ways of promoting collective intelligence.
Traffic volume patterns and their temporal evolution are among the most important issues for traffic prediction and traffic condition estimation. However, little work has been conducted on identifying traffic patterns and associating their occurrence with prevailing traffic conditions. In order to extract the patterns hidden in traffic volume fluctuations as well as their temporal evolution, we propose a three-layer strategy that first segments the volume series into subsequences. Then, recurrence quantification analysis is used to determine the statistical characteristics of the subsequences, and finally k-means clustering is applied to obtain the hidden traffic patterns. A case study using three typical weeks of traffic volume data acquired from a freeway in Minnesota, USA, shows that the proposed method is useful for traffic pattern identification as well as traffic prediction.
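A minimal sketch of the final clustering step described above, assuming each subsequence has already been reduced to a vector of recurrence quantification statistics (e.g. recurrence rate, determinism, entropy); the feature interface, cluster count and use of scikit-learn are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_traffic_patterns(rqa_features, n_patterns=4):
    """rqa_features: (n_subsequences, n_features) array of RQA statistics."""
    X = np.asarray(rqa_features, dtype=float)
    # Standardize features so no single RQA statistic dominates the distance.
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    labels = KMeans(n_clusters=n_patterns, n_init=10, random_state=0).fit_predict(X)
    return labels  # hidden traffic pattern index for each subsequence
```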
By introducing Clifford algebra as the theoretical foundation and mathematical tool, a prototype spatio-temporal analysis software system was proposed. The characteristics of this system can be summarized as follows. ① Under the premise of keeping compatibility with commonly used GIS data, a new type of spatio-temporal data model was proposed, which unifies the expression of temporal, spatial and attribute components within the multivector structure. ② Geometric and metric operator libraries were defined, which support multidimensional spatio-temporal analysis. ③ A plug-in based framework for constructing and integrating spatio-temporal analysis models was implemented. Typical GIS spatio-temporal analysis algorithms, such as multidimensional V-neighbor analysis, minimum union analysis and unified spatio-temporal process analysis with spacetime algebra, were implemented and integrated. Results suggest that the proposed system supports multidimensional spatio-temporal analysis effectively, and can also serve as a reference for improving research on unified spatio-temporal analysis methods and GIS systems.
A self-detection watermarking algorithm for vector data was proposed using logical blocking and quantization. Then, experiments testing the robustness of the self-detection were conducted, and the application of the proposed self-detection algorithm to improving watermark detection efficiency was investigated. The experimental results show that the algorithm is robust and can resist common attacks such as data compression, addition, deletion, editing, clipping and translation. At the same time, the filtering function of the self-detection can effectively improve watermark detection efficiency.
The differences in data granularity and structure among disk storage objects, memory management objects and rendering cache objects result in a large number of disk I/Os, which is one of the causes of data access delay. We propose a data organization method that unifies the granularity and structure across disk, memory and display cache. Firstly, the spatial index node of a hierarchically nested hybrid spatial index is used as the common basic operation unit for disk storage, memory management and rendering cache. Secondly, the data structures on disk and in memory are formatted according to the structure of the rendering cache objects, and the data layout method is extended to the object level in disk storage to reduce data format conversion and disk seek time. Tests show an obvious reduction of I/Os in the real-time rendering of large-scale 3D city models.
With spatial autocorrelation statistics, we reveal the spatial distribution of the incidence of hemorrhagic fever with renal syndrome (HFRS) in 2008. Because the spatial weight matrix greatly affects spatial autocorrelation, various weight matrices are adopted to conduct global and local analyses. The results indicate that: ① the distance matrix is more powerful than the adjacency matrix in describing the spatial distribution of HFRS; ② the incidence of HFRS exhibits significant spatial correlation when the distance threshold lies between 500 km and 800 km; ③ from a local perspective, high incidences are clustered significantly around Jilin Province, while Xinjiang, Guangxi, Qinghai and Tibet, with low incidences, are surrounded by provinces with high incidences.
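As an illustration of the global analysis with a distance-based weight matrix, the following sketch computes global Moran's I from province centroids and incidence rates under a distance-band weighting; the binary weighting and the interface are assumptions for illustration only:

```python
import numpy as np

def global_morans_i(coords, rates, threshold_km):
    coords = np.asarray(coords, float)   # (n, 2) province centroid coordinates, km
    x = np.asarray(rates, float)         # (n,) incidence rates
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = ((d > 0) & (d <= threshold_km)).astype(float)   # distance-band weight matrix
    z = x - x.mean()
    s0 = w.sum()
    # Moran's I: positive values indicate spatial clustering of similar incidences.
    return (len(x) / s0) * (z @ w @ z) / (z @ z)
```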
There are many fuzzy phenomena in the real world, and scale effects exist in modeling them, so high-order vagueness exists in the geographical world. Modeling high-order fuzzy objects and their direction relations is complex and difficult. The direction representation of high-order fuzzy geographical objects is analyzed based on interval type-2 fuzzy sets. The direction space is used to describe the direction membership value, and a divided direction space is used to replace the initial direction space. The method for analyzing direction relations between high-order fuzzy objects is studied, and membership value error representations of the direction space and direction relations are proposed.
In view of the disadvantages of coastline generalization using the simplification indexes of bend height and bend depth, a new method based on the skeleton lines of curve bends is introduced. Each bend is identified through the monotone curve, and the bend skeleton lines are extracted through the construction of a triangulated network. Experiments based on the generalization principles of unilateral coastlines show that the method is effective and feasible for keeping the coastline shape consistent.
The Douglas-Peucker polyline simplification algorithm has been widely adopted in map generalization for decades, though it is often criticized for its low performance. As multi-core processors become widely available, it is a good opportunity to improve performance by converting the sequential implementation into a parallel one. We present three different parallel implementations of the Douglas-Peucker algorithm. The first uses OpenMP in either recursive or non-recursive form. The second splits a polyline feature into independent segments and distributes the segments to parallel threads. The third dispatches each polyline feature to an idle parallel thread in which the conventional sequential method is applied. Using the official provincial boundary geospatial data set of China and C++ for programming, the performance of the three implementations and of the original sequential forms is compared on various multi-core computers. We show that, as the number of processor cores and threads increases, the parallel algorithms efficiently reduce the number of vertices of a polyline and generate multi-resolution polyline data, which dramatically speeds up map generalization and enables real-time display.
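The paper's implementations use C++ and OpenMP; the sketch below illustrates only the third scheme (dispatching whole polyline features to idle workers) in Python for brevity, with an ordinary sequential Douglas-Peucker as the per-feature kernel:

```python
from concurrent.futures import ProcessPoolExecutor
import math

def douglas_peucker(points, tol):
    """Sequential Douglas-Peucker on a list of (x, y) tuples."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0
    # Perpendicular distance of each interior point to the anchor segment.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm for x, y in points[1:-1]]
    imax = max(range(len(dists)), key=dists.__getitem__)
    if dists[imax] <= tol:
        return [points[0], points[-1]]
    split = imax + 1
    left = douglas_peucker(points[:split + 1], tol)
    right = douglas_peucker(points[split:], tol)
    return left[:-1] + right

def simplify_features_parallel(features, tol, workers=4):
    """Dispatch each polyline feature to an idle worker process."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(douglas_peucker, features, [tol] * len(features)))
```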
According to the basic requirements of digital watermarking for remote sensing images, a digital watermarking algorithm based on a pseudo-random sequence and the DCT is proposed for remote sensing images. Firstly, the original image is segmented into blocks, and the image blocks are selected by the pseudo-random sequence. Then, the low-frequency coefficients after the DCT are obtained, and the watermark information is embedded in the low-frequency coefficients of each selected block using a quantization method. Finally, the errors of the watermarked image are controlled by accuracy constraints. Because the watermark is embedded by quantization, authentication of the watermarked image needs no information about the original image; the algorithm thus realizes blind extraction of the digital watermark. The experimental results show that the proposed algorithm is robust and achieves near-zero change of the remote sensing image.
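A minimal sketch of quantization-based (blind) embedding and extraction on a DCT coefficient of pseudo-randomly selected blocks; the block size, quantization step, choice of coefficient and PRNG seed are illustrative assumptions rather than the paper's exact settings:

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, step=8.0):
    """Embed one bit into an 8x8 image block by quantizing a low-frequency DCT coefficient."""
    coeffs = dctn(block.astype(float), norm='ortho')
    q = int(np.round(coeffs[0, 0] / step))
    if q % 2 != bit:                 # force the quantizer index parity to the bit value
        q += 1
    coeffs[0, 0] = q * step
    return idctn(coeffs, norm='ortho')

def extract_bit(block, step=8.0):
    """Blind extraction: no reference to the original image is needed."""
    coeffs = dctn(block.astype(float), norm='ortho')
    return int(np.round(coeffs[0, 0] / step)) % 2

def select_blocks(n_blocks, n_marked, seed=2011):
    """Pseudo-random selection of which blocks carry watermark bits."""
    rng = np.random.default_rng(seed)
    return rng.choice(n_blocks, size=n_marked, replace=False)
```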
Existing OpenMP and thread-level parallel algorithms do not work well when speeding up the processing chain of remote sensing images. A pipeline is therefore used to process this complex image processing chain on multi-core platforms, and a pipeline-based parallel image processing method is presented. The realization of the software pipeline and how a large task is divided to fit pipeline-based processing are discussed in detail. The advantages of pipeline technology for the chained processing of remote sensing images are demonstrated experimentally.
Focusing on the problems of missed and false fire detections caused by environmental changes when using traditional fixed-threshold fire detection algorithms for forests and grasslands, we propose a space-time dynamic threshold fire detection algorithm that considers spatial and temporal factors such as season, location and land cover type. Based on data from the HJ-1A/B environment and disaster monitoring and forecasting small satellite constellation, we selected fire cases in Heilongjiang Province. Experimental results show that the space-time dynamic threshold fire detection algorithm, with its space-time adaptive feature, can improve fire detection results.
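The following sketch illustrates the dynamic-threshold idea in a simplified form: each pixel's detection threshold is derived from its spatial neighbourhood and adjusted by season- and land-cover-dependent offsets. The window size, multiplier and offset interface are assumptions, not the HJ-1A/B production algorithm:

```python
import numpy as np

def dynamic_fire_mask(bt, season_offset, landcover_offset, win=15, k=3.0):
    """bt: 2-D brightness-temperature image (K); offsets: 2-D per-pixel adjustments."""
    pad = win // 2
    padded = np.pad(bt, pad, mode='reflect')
    mask = np.zeros(bt.shape, dtype=bool)
    for i in range(bt.shape[0]):
        for j in range(bt.shape[1]):
            nb = padded[i:i + win, j:j + win]     # spatial neighbourhood of the pixel
            thresh = (nb.mean() + k * nb.std()
                      + season_offset[i, j] + landcover_offset[i, j])
            mask[i, j] = bt[i, j] > thresh        # flag pixel as a fire candidate
    return mask
```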
A new automatic target detection algorithm is proposed to detect infrared ships against complex sea-sky backgrounds. In the method, a combined high-pass filter in the frequency domain is designed to filter the original image and obtain potential target regions. Afterwards, a scale-adaptive local threshold segmentation method is presented to extract complete candidate ships from the potential target regions. Finally, waterline detection is employed to identify the true ships. As the new method does not depend on the position of the sea-sky line or on a high-brightness ship chimney or engine, its performance is greatly improved. Experimental results indicate that the proposed method not only demonstrates reliable detection capability for infrared ships against complex sea-sky backgrounds, but is also efficient.
The locally excitatory globally inhibitory oscillator network (LEGION), a neural oscillator network based on a biological framework, is applied to remote sensing image segmentation. For remote sensing images of complicated scenes, a simplified LEGION algorithm with an improved anisotropic diffusion method is presented to overcome noise sensitivity. In multi-spectral images, the pixel values at the same location are treated as a vector, and the connection weight between two oscillators is determined by the Mahalanobis distance between the two corresponding vectors. Experiments on real panchromatic images show that the proposed method produces more accurate segmentations than traditional algorithms, and the extended LEGION algorithm can also be applied to multi-spectral image segmentation.
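A sketch of how a connection weight between neighbouring oscillators could be derived from the Mahalanobis distance between spectral vectors, as described above; the exponential mapping from distance to weight and the parameter sigma are assumptions:

```python
import numpy as np

def connection_weight(pixel_a, pixel_b, inv_cov, sigma=1.0):
    """pixel_a, pixel_b: spectral vectors of two neighbouring pixels;
    inv_cov: inverse covariance matrix of the multi-spectral image,
    e.g. np.linalg.inv(np.cov(all_pixels.T))."""
    d = pixel_a - pixel_b
    maha = float(np.sqrt(d @ inv_cov @ d))
    return np.exp(-maha / sigma)   # large weight for spectrally similar pixels
```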
The net primary productivity (NPP) of vegetation in Wuhan City in 2009 was estimated by the CASA model with measured photosynthetically active radiation data, the normalized difference vegetation index and land cover data from the Moderate Resolution Imaging Spectroradiometer (MODIS), as well as meteorological data. The results show that the NPP per unit area of vegetation is 464.19 gC·m⁻²·a⁻¹. The accumulated NPP from June to August accounts for 56.8% of the annual gross NPP, the highest of the year, whereas the accumulated NPP from December to February accounts for only 5.6%, the lowest. Furthermore, the gross NPP of Huangpi is larger than 1 000 gC·m⁻²·a⁻¹ due to its large area of forest cover, while the gross NPP of the peri-urban area is lower than 400 gC·m⁻²·a⁻¹ because of the shortage of vegetation cover.
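The core of the CASA model is the product of absorbed photosynthetically active radiation and the light-use efficiency; the sketch below shows this standard formulation, with the stress factors reduced to two multiplicative terms as a simplification:

```python
def casa_npp(par, fpar, eps_max, temp_stress, water_stress):
    """NPP for one pixel and one time step (standard CASA formulation, simplified).
    par: incident photosynthetically active radiation;
    fpar: fraction of PAR absorbed by vegetation (derived from NDVI);
    eps_max: maximum light-use efficiency;
    temp_stress, water_stress: dimensionless stress factors in [0, 1]."""
    apar = par * fpar                             # absorbed PAR
    eps = eps_max * temp_stress * water_stress    # actual light-use efficiency
    return apar * eps                             # NPP (gC per m^2 per time step)
```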
3D models reconstructed by point cloud triangulation often have edge burrs. Taking the abundant line and plane information of city buildings into account, we suggest retrieving the 3D framework model of a city building facade based on generalized point photogrammetry, which extends conventional line photogrammetry and infinite point (vanishing point) theory. Firstly, initial values of the image orientation elements are obtained from vanishing points. Then, straight lines in object space are derived from matched image lines. Next, a bundle block adjustment based on generalized points (lines) is presented to solve for the orientation elements. Finally, the 3D framework of the city building is reconstructed by means of space edge recognition and mapping between texture and model space. Experimental results show that facade reconstruction based on line and plane features is not only concise, efficient and highly automated, but also superior to classic point-cloud-based 3D facade reconstruction in distinguishing edges, and clear 3D edges can be acquired by the proposed approach.
A new method to optimize the selection of common master images based on the minimum sum of three baselines is presented and implemented, which further integrates and optimizes the three main parameters affecting the correlation of interferograms on the basis of existing models. A comparative experiment on optimal selection of common master images in PS-InSAR is carried out with 19 ERS-1/2 SLC SAR images and 10 ENVISAT ASAR images. Theoretical analysis and the experimental results show that, compared with the existing method, the new method is simple and practical, and the selected common master images are more reasonable and accurate.
Because the permanent scatterer and small baseline subset methods need no fewer than 20 SAR images to obtain reliable results, a time series interferogram stacking method is used to detect ground subsidence in Yancheng City. In order to minimize the effect of atmospheric delay errors, a structure function is used to select the interferograms with small atmospheric effects. The ground subsidence of Yancheng City from 2003-12 to 2009-05 is obtained from 13 ASAR images. The results show that there were four subsidence funnels in downtown Yancheng from 2003-12 to 2005-11, and that new subsidence funnels appeared and the local subsidence rate increased from 2008-12 to 2009-05.
During the Antarctic summer of 2008-2009, China's 25th Antarctic scientific expedition team established a high-precision gravity base network over Zhongshan Station and the neighboring Larsemann Hills in Antarctica, using an A-10 portable absolute gravimeter and a LaCoste & Romberg G relative gravimeter. The network is composed of three absolute gravity stations and 10 relative gravity points. The accuracies of the absolute and relative gravimetry are better than 7.5×10⁻⁸ m·s⁻² and 20×10⁻⁸ m·s⁻², respectively.
Both the relative and absolute antenna phase center models (IGS_01 and IGS05) are described. The influence of these two models on PPP estimates, including coordinates, zenith path delay (ZPD) and receiver clock bias, is analyzed. Results show that the difference in estimated ZPD between the two models is about 5 mm, while the differences in estimated receiver clock bias and station height are around 3 ns and 1 cm, respectively. In addition, with the absolute model, the accuracy of PPP-derived ZPD is better than 5 mm, the horizontal positioning accuracy reaches 1 cm, and the accuracy of the estimated clock bias is about 0.1 ns.
Based on the fact that GPS and GLONASS measurements contain hardware delay biases, the mathematical model of combined GPS and GLONASS precise point positioning (PPP) is derived, and the influence of the hardware delays on the estimated unknown parameters is further analyzed. The existence of the hardware delay biases is validated by results obtained from several IGS stations. The combined GPS/GLONASS PPP model is tested using IGS static data and kinematic data collected in an experiment, and the numerical results are analyzed in comparison with GPS-only PPP.
When using a Kalman filtering algorithm for GPS-based orbit determination of LEO satellites, both an observation model and a dynamic model have to be dealt with. The GPS broadcast ephemeris algorithm is regarded as the dynamic model of the kinematic filter. Firstly, the fitting algorithm for GPS broadcast ephemeris parameters is introduced. Then, the advantages and disadvantages of geometric orbit determination are analyzed. Finally, a new adaptively robust synthetic orbit determination algorithm is developed based on the prediction characteristic of the GPS broadcast ephemeris algorithm and the adaptively robust filtering principle. The results show that the adaptively robust synthetic orbit determination algorithm not only makes good use of the geometric observation information but also reasonably adjusts the contributions of the geometric observations and the ephemeris-predicted information to the filtering solution.
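A highly simplified sketch of one adaptively robust measurement update: the state predicted with the broadcast-ephemeris dynamics is de-weighted by an adaptive factor when it disagrees with the geometric (pseudorange-based) observations. The two-segment adaptive factor used here is a common illustrative choice, not necessarily the paper's exact formulation:

```python
import numpy as np

def adaptive_robust_update(x_pred, P_pred, z, H, R, c0=1.5):
    """x_pred, P_pred: predicted state and covariance (dynamic model);
    z, H, R: geometric observations, design matrix and observation covariance."""
    v = z - H @ x_pred                                  # innovation
    S = H @ P_pred @ H.T + R
    s = np.sqrt(v @ np.linalg.solve(S, v) / len(v))     # normalized discrepancy
    alpha = 1.0 if s <= c0 else c0 / s                  # adaptive factor in (0, 1]
    P_a = P_pred / alpha                                # inflate prior: trust observations more
    K = P_a @ H.T @ np.linalg.inv(H @ P_a @ H.T + R)
    x = x_pred + K @ v
    P = (np.eye(len(x_pred)) - K @ H) @ P_a
    return x, P
```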
We analyzed the impact of the geomagnetic gradient on the aircraft magnetic field, and solved the aircraft magnetic field model using flight test data from aircraft magnetic compensation and calculated values of the geomagnetic gradient. The solutions obtained with and without considering the geomagnetic gradient were compared. Then a compensation simulation test of the interference magnetic field was carried out, where the interference magnetic field was caused by roll, pitch and yaw maneuvers of the aircraft on headings of 0°, 90°, 180° and 270°. The simulation results show that the absolute error of the model coefficients caused by the geomagnetic gradient can reach 5.24, and the improvement rate (IR) decreases by 35.3%.
GOCE has released gravity gradient data, but there are no data in the polar regions because the orbital inclination is about 96.7°. Data in these areas need to be simulated with a known gravity field model, whereas many formulas for computing gravity gradients are singular at the poles, which causes difficulties in gravity field recovery using GOCE data. In order to overcome these singularities, we first analyze the singular term and then derive a new nonsingular expression for computing gravitational gradients based on the properties of Legendre functions. Finally, we compare several different formulas through computation. The proposed formulas are more accurate than the conventional ones.
Observed tidal level values contain not only the real tidal level information but possibly also gross errors and systematic errors, which seriously affect the quality of tidal data and its applications. In order to eliminate erroneous observations and recover the real tidal level information, we propose an integrated detection and repair method based on tidal harmonic analysis, probability theory and mathematical statistics. Experiments on tidal level values collected by long-period tide gauges show good performance.
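A sketch of the detection-and-repair idea: fit a few tidal constituents by least squares, flag observations whose residuals exceed a 3-sigma bound as gross errors, and replace them with the fitted tide. The constituent list (M2, S2, K1, O1) and the 3-sigma rule are illustrative simplifications of the full harmonic analysis:

```python
import numpy as np

FREQS_DEG_PER_HOUR = [28.9841042, 30.0, 15.0410686, 13.9430356]  # M2, S2, K1, O1

def detect_and_repair(t_hours, levels, k=3.0):
    t = np.asarray(t_hours, float)
    y = np.asarray(levels, float)
    cols = [np.ones_like(t)]                       # mean sea level term
    for f in FREQS_DEG_PER_HOUR:
        w = np.deg2rad(f) * t
        cols += [np.cos(w), np.sin(w)]             # one cosine/sine pair per constituent
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares harmonic fit
    fitted = A @ coef
    resid = y - fitted
    bad = np.abs(resid) > k * resid.std()          # gross-error candidates
    repaired = np.where(bad, fitted, y)            # replace flagged values with the fitted tide
    return repaired, bad
```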
As the noise in orbit-derived accelerations is colored, a decorrelation filter based on an ARMA (autoregressive moving average) model is proposed to suppress the noise. The principle of the acceleration approach and the least-squares solution using the decorrelation algorithm are introduced, and the mathematical model for simultaneously estimating the accelerometer scale factors, bias parameters and geopotential coefficients is derived. Furthermore, the specific scheme for processing the colored noise of satellite accelerations with the ARMA filter is discussed. The Earth's gravity field model WHUCHAMP-ACC60KP, up to degree 60, is recovered from 46 days of CHAMP kinematic orbits and accelerometer data. Comparison with the models EGM96, EIGEN-1S, EIGEN-2 and EIGEN-CHAMP03S shows that WHUCHAMP-ACC60KP is more accurate than EIGEN-1S and its overall accuracy is close to that of EIGEN-2, which validates the effectiveness of the proposed method.
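A sketch of the decorrelation idea using a pure AR model (the paper uses a full ARMA model): estimate AR coefficients from the acceleration residuals and apply the corresponding whitening filter to the observation equations; the ordinary-least-squares estimation and the fixed order are assumptions:

```python
import numpy as np

def fit_ar(residuals, order=10):
    """Least-squares estimate of AR coefficients a such that r[n] ~ sum_k a[k] * r[n-k-1]."""
    r = np.asarray(residuals, float)
    X = np.column_stack([r[order - k - 1:len(r) - k - 1] for k in range(order)])
    y = r[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def whiten(series, a):
    """Apply the decorrelation (inverse AR) filter: w[n] = r[n] - sum_k a[k] * r[n-k-1]."""
    r = np.asarray(series, float)
    order = len(a)
    w = r[order:].copy()
    for k in range(order):
        w -= a[k] * r[order - k - 1:len(r) - k - 1]
    return w
```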
Based on the characteristics of the TurboEdit algorithm for cycle slip detection in GPS observation data, a fixed-length sliding window fitting model that improves the geometry-free combination method has been designed, and a cycle slip repair algorithm using least-squares Chebyshev polynomial fitting is proposed. The experimental results show that the improved TurboEdit algorithm can detect small cycle slips as small as one cycle, large cycle slips, and successive small and large cycle slips. The least-squares Chebyshev polynomial fitting can also repair cycle slips more precisely, laying the foundation for subsequent data processing.
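A sketch of the fixed-length sliding-window detection on the geometry-free phase combination: a low-order Chebyshev polynomial is fitted to the previous window and a cycle slip is flagged when the prediction error at the next epoch exceeds a threshold. Window length, polynomial degree and threshold are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

def detect_slips_gf(gf_series, window=20, degree=3, threshold=0.05):
    """gf_series: geometry-free phase combination (metres) per epoch."""
    gf = np.asarray(gf_series, float)
    slips = []
    for i in range(window, len(gf)):
        t = np.arange(window, dtype=float)
        coef = cheb.chebfit(t, gf[i - window:i], degree)   # fit the sliding window
        predicted = cheb.chebval(float(window), coef)      # extrapolate to the next epoch
        if abs(gf[i] - predicted) > threshold:             # jump => cycle slip candidate
            slips.append(i)
    return slips
```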
Churn is an inherent property of P2P networks. Dramatic churn causes long lookup delays and can even partition DHTs, so shortening the lookup delay and improving the lookup success ratio in churned DHTs is a hot topic. We propose a virtual node scheme that enhances the mean session time of the network by generating several virtual nodes on stable real nodes. Simulations show that the virtual node scheme can decrease the lookup delay and, using the PVC framework, provide a reference for selecting proper values of the scheme's two parameters in terms of efficiency and cost.
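A sketch of the virtual node idea on a Chord-like identifier ring: a physical node whose observed session time exceeds a stability threshold hosts several virtual identifiers, so more of the ring is served by stable peers. The hashing scheme, threshold and virtual-node count are illustrative assumptions:

```python
import hashlib

RING_BITS = 160

def virtual_ids(node_addr, n_virtual):
    """Derive n_virtual ring positions for one physical node."""
    ids = []
    for i in range(n_virtual):
        h = hashlib.sha1(f"{node_addr}#{i}".encode()).hexdigest()
        ids.append(int(h, 16) % (1 << RING_BITS))
    return ids

def assign_virtual_nodes(nodes, session_times, stable_threshold=3600, n_virtual=4):
    """Stable nodes (long observed session time) host extra virtual nodes."""
    ring = {}
    for addr, session in zip(nodes, session_times):
        count = n_virtual if session >= stable_threshold else 1
        for vid in virtual_ids(addr, count):
            ring[vid] = addr        # map each virtual identifier to its physical host
    return ring
```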