2014 Vol. 39, No. 4
Objective As the age of big data arrives, society, science, and the economy have been undergoing a great revolution. Location big data, produced mainly by ubiquitous mapping, is now an important part of big data research. Ubiquitous mapping analyzes the relations of humans, both individuals and groups, to their natural and social environments. Location big data has become an important strategic resource for analyzing people's behavior patterns, assessing the geographic situation, and building smart cities. Through processing and analyzing location big data, raw positioning data can be extended to relationships among human social attributes and environments. This has greatly strengthened the links among computing, data, and mapping technologies, and formed an intelligent, socialized ubiquitous mapping computation. This paper introduces location big data from three aspects: firstly, the classification, features, effects, and significance of location big data; secondly, its applications in social awareness, swarm intelligence system development, and geographic situation analysis; lastly, common processing methods for location big data, including map and trajectory data preprocessing, dimension reduction analysis, and collaborative mining.
Objective The high-resolution Earth observation system is not only a critical infrastructure related to national security, economic construction, and social development, but also, with its great market value, promotes the transformation of China's economic development. In accordance with the requirement proposed at the Third Plenary Session of the 18th Central Committee to "make the market play a decisive role in allocating resources", the development of China's high-resolution Earth observation system must take the road of commercial operation. Firstly, the main problems in the development of China's high-resolution Earth observation systems are analyzed briefly; secondly, the necessity and feasibility of commercial operation of a high-resolution Earth observation satellite system are discussed; finally, specific suggestions on the development of a commercial high-resolution Earth observation system are presented.
Objective Crowd-sourced geographic data (CSGD) is a new kind of open geospatial data collected and provided to citizens or organizations by non-professional volunteers. Its acquisition differs from conventional methods of surveying and mapping. It has the characteristics of being up-to-date, informative, low cost, and large in size. Unique issues of such data are its heterogeneous data quality, redundancy and incompleteness, uneven data coverage, absence of data standards, privacy and safety concerns, etc. It can be employed in a wide range of application fields including emergency cartography, early warning, map and database updating, crime analysis, and epidemiologic studies. Thus, it is important to systematically investigate the key aspects of its processing and analysis. This paper reviews the progress and challenges of CSGD processing and analysis. Firstly, the paper summarizes the concept and characteristics of CSGD. Secondly, it introduces the acquisition methods and potential data sources of CSGD. Thirdly, it discusses some key methods and techniques in the processing and analysis of CSGD, including data quality metrics, information extraction and updating, and spatial data analysis and mining. Specifically, the paper suggests that, firstly, data evaluation is a critical prerequisite for CSGD research and application; secondly, information extraction and updating via CSGD can complement the conventional methods of geographic database updating; and finally, spatial data analysis and mining on CSGD can provide valuable information and knowledge for potential applications, using methods like network topological analysis, spatial statistical analysis, and spatial data mining. Lastly, the paper draws some conclusions on the state of the art of CSGD processing and analysis and points out future research directions.
Objective In this paper, the membership function from fuzzy logic is applied to image segmentation. Instead of a segmentation threshold, the value of a membership function within a certain range is used to determine which category a pixel belongs to. This method is called the fuzzy logic image segmentation (FLIS) method. Through image segmentation experiments with real aerial images, we compare the experimental results with those of existing methods. The results show that the FLIS method has clear advantages.
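The core idea can be sketched minimally as follows. The abstract does not specify the membership function or the acceptance range, so the Gaussian shape, its center and width, and the [0.5, 1] range below are purely illustrative assumptions:

```python
import math

def membership(gray, center=128.0, width=40.0):
    # Gaussian-shaped membership of a pixel in the "object" class
    # (an illustrative choice; the paper's function is not given here)
    return math.exp(-((gray - center) / width) ** 2)

def flis_segment(image, low=0.5, high=1.0):
    # label a pixel as object (1) when its membership value falls
    # inside [low, high], instead of comparing the gray value to a threshold
    return [[1 if low <= membership(g) <= high else 0 for g in row]
            for row in image]

img = [[120, 130, 20],
       [125, 200, 15]]
labels = flis_segment(img)  # → [[1, 1, 0], [1, 0, 0]]
```

The membership range plays the role of the conventional threshold, but allows a graded notion of how strongly a pixel belongs to the object class.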
Objective The assumption in traditional relative radiometric normalization that the spectral responses of different types of objects in different periods have the same linear relationship is insufficient for the analysis of high resolution remote sensing images. Object-oriented relative radiometric normalization for high resolution remote sensing image change detection is proposed in this paper, based on the assumption that the spectral responses of different types of objects in different periods have different linear relationships. Firstly, image objects are divided into two categories, changed and unchanged, by correlation coefficients. Secondly, gain and offset parameters are calculated by random sample consensus analysis based on the unchanged image objects. Thirdly, the gain and offset parameters of the unchanged image objects most similar to the changed image objects are assigned to the changed image objects. Lastly, the image objects are corrected using the gain and offset parameters. Experiments on high resolution remote sensing images verify the effectiveness of the proposed method.
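The gain/offset estimation step can be illustrated with a simplified random-sample-consensus sketch over per-object mean values. The iteration count, inlier tolerance, and the sample data are illustrative assumptions, not the paper's parameters:

```python
import random

def ransac_gain_offset(x, y, n_iter=200, tol=2.0, seed=0):
    # estimate y ≈ gain*x + offset between two dates from object mean values,
    # treating changed objects as outliers (simplified RANSAC sketch)
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        i, j = rng.sample(range(len(x)), 2)
        if x[i] == x[j]:
            continue
        g = (y[j] - y[i]) / (x[j] - x[i])
        o = y[i] - g * x[i]
        inliers = [k for k in range(len(x)) if abs(y[k] - (g * x[k] + o)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # refit gain and offset by least squares on the consensus set
    xs = [x[k] for k in best_inliers]
    ys = [y[k] for k in best_inliers]
    n = len(xs); sx = sum(xs); sy = sum(ys)
    sxx = sum(v * v for v in xs)
    sxy = sum(a * b for a, b in zip(xs, ys))
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset

x = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]   # date-1 object means
y = [1.2 * v + 5 for v in x]                    # date-2: gain 1.2, offset 5
y[3] += 50; y[7] -= 40                          # two "changed" objects
gain, offset = ransac_gain_offset(x, y)         # ≈ (1.2, 5.0)
```

Objects whose residuals exceed the tolerance never enter the consensus set, so the changed objects do not bias the fitted normalization parameters.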
Objective Segmentation, which is of great importance, is a main step of remote sensing image processing. Many studies have focused on this issue and several common methods have been proposed. These methods have their own advantages and fields of application, but most of them rely too heavily on computer calculation. We bring the idea of image segmentation based on cellular automata into a GIS platform as a new way to handle remote sensing image segmentation, implementing a remote sensing image segmentation algorithm based on cellular automata. This segmentation method blends human judgment into the segmentation process and is very flexible.
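A minimal sketch of cellular-automaton region growing follows. The abstract does not give the paper's transition rule, neighborhood, or GIS integration, so the asynchronous 4-neighbor rule, gray-value tolerance, and user-supplied seeds below are assumptions; the seeds stand in for the human judgment the method blends in:

```python
def ca_segment(image, seeds, tol=10, max_steps=100):
    # seeds: {(row, col): label} chosen by the user; unlabeled cells
    # iteratively adopt a 4-neighbour's label when the grey values are
    # within `tol` (a simplified, asynchronously updated CA rule)
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    for (r, c), lab in seeds.items():
        labels[r][c] = lab
    for _ in range(max_steps):
        changed = False
        for r in range(h):
            for c in range(w):
                if labels[r][c]:
                    continue
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and labels[rr][cc] \
                            and abs(image[r][c] - image[rr][cc]) <= tol:
                        labels[r][c] = labels[rr][cc]
                        changed = True
                        break
        if not changed:          # stop once the automaton reaches a fixed point
            break
    return labels

img = [[10, 12, 90],
       [11, 13, 95],
       [ 9, 92, 94]]
seg = ca_segment(img, {(0, 0): 1, (0, 2): 2})
# → [[1, 1, 2], [1, 1, 2], [1, 2, 2]]
```

Each cell's state depends only on its own value and its neighbors' labels, which is what makes the rule a cellular automaton rather than a global optimization.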
Objective Linear array panoramic cameras enable the acquisition of 360° panoramic scenes by rotating a linear CCD. They use fewer camera stations and avoid image mosaicking in close-range photogrammetry. We developed a sensor model and an adjustment model for linear array panoramic cameras, and demonstrate the models on simulated data and indoor panoramic 3D control field data. These experiments show that the parameters of the model are logical and that they accurately describe the internal structure of the linear array panoramic camera. The model is a practical calibration model for linear array panoramic cameras.
Objective In order to accurately and quickly extract information on rubber woods, a new method for extracting the distribution of rubber woods was designed based on the textural and multi-spectral features of high resolution remote sensing images. The detailed process is as follows: firstly, choose a suitable vegetation index; secondly, acquire the best texture extraction window through semi-variance statistical analysis of the vegetation index and multi-spectral images, and extract texture information from the remote sensing images; finally, build new classification rules based on the texture and spectral information of the high resolution remote sensing image, and realize the new method using the C5.0 decision tree algorithm. The new method was applied to high resolution remote sensing images of Guangba Farm, Dongfang City, Hainan Province. The results showed that the producer's accuracy, user's accuracy, and total accuracy for rubber woods are 81.00%, 82.65%, and 83.50% respectively, and the kappa coefficient is 0.78. Comparison with other classification methods indicated that the method is valid for rubber woods identification.
Objective A new adaptive filter for suppressing speckle in SAR interferograms is proposed, based on empirical mode decomposition and the different characteristics of signal and noise in different IMFs. First, empirical mode decomposition is used to separate the signal, and the high-frequency IMF components are processed separately by adaptive filtering. The denoising effects of the proposed method, a conventional filter, and a multiscale BEMD filter were investigated experimentally. When the part related to the speckle is subtracted from the original interferogram, speckle noise is reduced. The results are compared with four other methods: mean filtering, median filtering, BEMD decomposition, and ordinary adaptive filtering. They show that the BEMD-adaptive filter is a powerful means of interferogram speckle noise reduction, and can preserve fine details in the interferogram directly related to the ground topography as well as maintain the phase value distribution.
Objective Considering that the accuracy of gravity matching is influenced by heading and the gravity field, the statistical characteristics of the gravity field are calculated within a moving window to select a gravity matching area, and the correlation between gravity matching accuracy and carrier heading is confirmed. Then, the skeleton points of matching areas are extracted by a fast Euclidean distance field algorithm and a simplification algorithm to form local areas with their distance values; by calculating the gravity statistical characteristics of these local areas, the carrier can select headings with high gravity matching accuracy. Finally, the validity of this analysis method is verified with simulation results.
Objective It is of great significance for construction planning, seismic fortification, and sustainable development to determine the fault parameters of Shenzhen. In this paper, the parameters of faults beneath two survey lines in Shenzhen are inverted by a human-computer interaction method. The results show that: firstly, the relative errors between the forward model and the observations of the two survey lines are 4.51% and 4.26%, so the computed results are reliable; secondly, the faults range from about 4 km to 17 km and are normal faults with dip angles of approximately 70 degrees; finally, the locations of the faults may be correlated with the topography of Shenzhen. The results are in accordance with other geological, exploratory, and geophysical data, and show that profile gravimetry data are feasible for inverting fault parameters in Shenzhen.
Objective This paper analyzes the chaotic properties of ionospheric total electron content based on 600 TEC data from days 101-150 of 2008 over 120°E and 45°N. Calculation results show that the TEC time series is chaotic, with a correlation dimension of 2.2632, an embedding dimension of 5, and a largest Lyapunov exponent of 0.0833. A new method for choosing more similar phase points is advanced based on cosine and cluster analysis, while the weighted one-rank local-region forecasting method is used successfully to forecast the TEC time series. The search results show that this method can find more similar phase points than the Euclidean distance and cosine methods in 5-dimensional space. Forecasting results show that the STD (0.618 TECU) and RMS (0.623 TECU) based on cosine and cluster analysis are smaller than those of the other two methods. Therefore, the method can systematically select similar phase points and improve forecasting precision.
Objective Precise orbit determination of medium and high Earth orbital satellites mainly depends on ground-based tracking systems, which suffer from limitations such as insufficient tracking arcs and low accuracy, and are often unable to support autonomous navigation. The establishment of a GNSS crosslink network provides a new solution for orbit determination of medium and high Earth orbital satellites. This paper introduces a new method for orbit determination of medium and high Earth orbital satellites based on GNSS crosslink ranging observations. The visibility, positioning accuracy, and crosslink budget are analyzed, and a simulated calculation of autonomous orbit determination (AOD) was performed. The results show that with sufficient visible satellites, positioning accuracy, and link margins, the AOD of medium-high Earth orbital satellites using GNSS crosslink ranging observations can achieve a positioning error within 6 m in the radial direction and 2 m in the along-track and cross-track directions; the new method is feasible and can open new directions in the application of GNSS.
Objective Considering that it is difficult to repair cycle slips on each frequency with a triple-frequency geometry-free phase combination, a new method based on linear combinations of GNSS triple-frequency undifferenced observations is proposed to detect and repair cycle slips. According to appropriate principles, triple-frequency pseudorange and phase observations are used to form different effective geometry-free linear combinations. The threshold of each combination for cycle-slip detection is deduced. The performance of the new method in detecting big jumps, small jumps, and special jumps is analyzed systematically. After rounding the big cycle slips and repairing the observations, a searching method is used to repair small cycle slips. Finally, GPS triple-frequency data were used to verify the correctness of the algorithm. The results show that the proposed algorithm can simply and instantaneously detect and repair all possible cycle-slip combinations on every frequency.
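The geometry-free idea behind such combinations can be sketched for the simpler dual-frequency case: geometry, clocks, and troposphere cancel in the combination, so an epoch-to-epoch jump reveals a cycle slip. The paper's triple-frequency combinations and derived thresholds are not reproduced here; the 0.05 m threshold below is an illustrative assumption:

```python
# GPS L1/L2 wavelengths in metres, from the known carrier frequencies
C = 299792458.0
L1_WL = C / 1575.42e6
L2_WL = C / 1227.60e6

def gf_combination(phi1, phi2):
    # geometry-free phase combination in metres: the geometric range,
    # clock errors and troposphere cancel, leaving the ionospheric delay,
    # the (constant) ambiguities, and any cycle slips
    return [L1_WL * p1 - L2_WL * p2 for p1, p2 in zip(phi1, phi2)]

def detect_slips(phi1, phi2, threshold=0.05):
    # flag epochs where the epoch-differenced GF combination exceeds the
    # threshold (metres); phi1/phi2 are carrier phases in cycles
    gf = gf_combination(phi1, phi2)
    return [k for k in range(1, len(gf)) if abs(gf[k] - gf[k - 1]) > threshold]

# a 3-cycle slip on L1 at epoch 6 produces a jump of about 0.57 m
phi1 = [1000.0] * 6 + [1003.0] * 4
phi2 = [800.0] * 10
epochs = detect_slips(phi1, phi2)  # → [6]
```

Because one geometry-free combination cannot separate slips on the individual frequencies, the paper forms several independent combinations across three frequencies, which is what allows repair on every frequency.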
Objective In this article, we derive a spherical strain Kriging formula based on the basic theory of Kriging, apply it to simulated and real GPS data, and analyze its differences from the least-squares collocation method. Cross-check results indicate that Kriging interpolation is feasible and valid for GPS velocity smoothing and gridding. The Kriging strain results reveal low robustness and obvious edge effects, but the smoothed and gridded GPS velocity data for 1999-2004 from Kriging interpolation agree with the results calculated by the least-squares collocation method. The strain rate results from the two methods are similar in their overall distribution characteristics; however, the Kriging results show low self-consistency. In a word, the Kriging strain method is not as good as the least-squares collocation method in terms of robustness and edge effects.
Objective Errors in both the coefficient matrix and the observation vector can be reasonably taken into account in mixed LS-TLS. However, in such a solution, gross errors in the observation data are ignored. To solve this problem, a robust method of mixed LS-TLS is proposed based on the IGGII scheme. A comparative evaluation with least squares, total least squares, and mixed LS-TLS was conducted for plane fitting with a set of simulations and a set of real-life data. The results confirm that the proposed method is more reliable than the others.
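The robust-reweighting idea can be sketched for a plain line fit rather than the full mixed LS-TLS machinery, which the abstract does not detail. The IGGII-style weight function below, its constants k0 and k1, and the MAD-based scale estimate are all assumptions for illustration:

```python
import statistics

def igg_weight(v, sigma, k0=1.5, k1=3.0):
    # IGGII-style three-segment weight (constants assumed): full weight for
    # small residuals, down-weighted in the mid-range, zero beyond k1*sigma
    t = abs(v) / sigma
    if t <= k0:
        return 1.0
    if t <= k1:
        return k0 / t
    return 0.0

def robust_line_fit(x, y, iters=10):
    # iteratively reweighted least squares for y = a*x + b with IGG weights
    w = [1.0] * len(x)
    a = b = 0.0
    for _ in range(iters):
        sw = sum(w)
        sx = sum(wi * xi for wi, xi in zip(w, x))
        sy = sum(wi * yi for wi, yi in zip(w, y))
        sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = sw * sxx - sx * sx
        a = (sw * sxy - sx * sy) / det
        b = (sxx * sy - sx * sxy) / det
        res = [yi - (a * xi + b) for xi, yi in zip(x, y)]
        # robust scale from the median absolute residual (assumed choice)
        sigma = max(statistics.median(abs(r) for r in res) / 0.6745, 1e-9)
        w = [igg_weight(r, sigma) for r in res]
    return a, b

x = list(range(10))
y = [2.0 * xi + 1.0 for xi in x]
y[4] += 30.0                      # one gross error
a, b = robust_line_fit(x, y)      # ≈ (2.0, 1.0): the outlier is rejected
```

The gross error's weight drops to zero after the first reweighting, so the final estimate matches the clean data, which is the behavior the robust mixed LS-TLS aims for.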
Objective The wavelet decomposition scale is directly related to the de-noising effect. For de-noising deformation sequences, the authors combine the information criterion used for order determination in time series modeling with the characteristics of Gaussian white noise under the wavelet transform, and propose using the Akaike information criterion as a quantitative index to determine the optimal decomposition scale. The calculated examples show that when the Akaike information criterion reaches its minimum, the determined decomposition scale conforms to the distribution of signal and noise and the de-noising effect is better. Using the Akaike information criterion as a quantitative index to determine the optimal decomposition scale is therefore effective, and it improves the convenience of wavelet de-noising in deformation data processing.
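A minimal sketch of AIC-guided selection of the decomposition scale follows, using a Haar wavelet and a hard threshold. The paper's wavelet, thresholding rule, and exact AIC penalty term are not specified in the abstract, so those choices (and counting the level as the AIC parameter number) are assumptions:

```python
import math

def haar_decompose(x, levels):
    # multilevel orthonormal Haar DWT; len(x) must be divisible by 2**levels
    approx, details = list(x), []
    for _ in range(levels):
        a = [(approx[2 * i] + approx[2 * i + 1]) / math.sqrt(2)
             for i in range(len(approx) // 2)]
        d = [(approx[2 * i] - approx[2 * i + 1]) / math.sqrt(2)
             for i in range(len(approx) // 2)]
        details.append(d)
        approx = a
    return approx, details

def haar_reconstruct(approx, details):
    a = list(approx)
    for d in reversed(details):
        nxt = []
        for ai, di in zip(a, d):
            nxt.append((ai + di) / math.sqrt(2))
            nxt.append((ai - di) / math.sqrt(2))
        a = nxt
    return a

def denoise(x, levels, thresh):
    # hard-threshold the detail coefficients at every scale (assumed rule)
    approx, details = haar_decompose(x, levels)
    details = [[0.0 if abs(c) < thresh else c for c in d] for d in details]
    return haar_reconstruct(approx, details)

def best_level_by_aic(x, max_level, thresh):
    # pick the scale minimising AIC = n*ln(RSS/n) + 2*level (assumed form)
    n, best = len(x), None
    for j in range(1, max_level + 1):
        xhat = denoise(x, j, thresh)
        rss = sum((xi - yi) ** 2 for xi, yi in zip(x, xhat))
        aic = n * math.log(rss / n + 1e-12) + 2 * j
        if best is None or aic < best[0]:
            best = (aic, j)
    return best[1]
```

The residual between the noisy series and the de-noised reconstruction plays the role of the model residual in the information criterion, so the scale that minimizes AIC balances noise removal against signal distortion.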
Objective In recent years, although data provide more information about monitoring objectives as the sampling frequency gradually increases, preprocessing faces higher requirements. To solve these problems, this paper proposes an improved hybrid filter method based on wavelet packet analysis, which combines a weighted median filter on the high-frequency coefficients with an overall weighted mean filter to preprocess the data. Theoretical analysis and practical application show that this method is simple and fast, and strongly suppresses random noise, non-random noise, gross errors, and other useless information, while highlighting trend, periodic, and other useful information about the monitoring objectives. The improved method has a better de-noising effect than traditional methods, is particularly suitable for high-rate deformation monitoring data, and is a practical method for data processing.
Objective By verifying the accuracy and consistency of spatial data, quality evaluation becomes the precondition and key point of updating systems. In an incremental updating process, only changed objects are updated, so quality evaluation starts from the neighborhood of a changed object. First, the geometric similarity between an updated object and its corresponding source object is expressed and computed. Second, the spatial relation similarity between the neighborhood of an updated object and its corresponding region is calculated. Third, the quality evaluation sequence is determined based on the principle of maximizing the reliability of quality evaluation, and a two-stage evaluation method is designed based on the characteristics of adjacent objects. Tests illustrate that the method can effectively and practically detect mistakes in updated objects and their relationships during updating, while confining the quality evaluation region to the neighborhood of an updated object.
Objective Massive spatial data are currently stored with textual locality descriptions rather than geographic coordinates. In geographic information systems, the lack of geographic coordinates makes these data difficult to query and analyze. Georeferencing textual locality descriptions makes these data more valuable by allowing them to be used in spatial analyses. Existing methods do not analyze the spatial constraints between reference points in linear locality descriptions. This paper proposes a new method for georeferencing linear locality descriptions. On the basis of the traditional method, the local extent direction is employed to compute a probability distribution of the linear locality at each reference point. Moreover, the concept of fuzzy visibility is introduced to compute vague distributions of the linear locality among reference points. Experimental results show that the proposed method is useful for georeferencing linear locality descriptions.
Objective The analysis and discovery of spatial association is a hot issue in the field of spatial data mining. However, little attention has been paid to the spatial association of network spatial phenomena. The objective of this article is to demonstrate the application of the Q statistic, developed for analyzing the spatial association of qualitative variables, to detecting the spatial association of network spatial phenomena. The paper introduces the concept of the Q statistic and defines the distance measure by the shortest path. The spatial association of hotels in Shenzhen, China is analyzed using the Q statistic, and a comparative analysis of the Q statistic in Euclidean space and network space is conducted. The results show agglomeration effects among hotels from a spatial data mining perspective, and also show that the Q statistic is valid for spatial association analysis of network spatial phenomena.
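The shortest-path distance measure can be sketched with Dijkstra's algorithm on a road network. How the paper then builds the neighbor sets (m-surroundings) feeding the Q statistic is not detailed in the abstract, so the nearest-neighbor construction below is an assumption:

```python
import heapq

def dijkstra(adj, src):
    # shortest-path distances on a network given as an adjacency map
    # adj: {node: [(neighbour, edge_length), ...]}
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def m_nearest(adj, src, m):
    # the m network-nearest neighbours of src, e.g. hotels ranked by
    # road distance rather than Euclidean distance (assumed construction)
    dist = dijkstra(adj, src)
    others = sorted((d, v) for v, d in dist.items() if v != src)
    return [v for _, v in others[:m]]

road = {"A": [("B", 1), ("C", 5)],
        "B": [("A", 1), ("C", 2)],
        "C": [("B", 2), ("A", 5), ("D", 1)],
        "D": [("C", 1)]}
near = m_nearest(road, "A", 2)  # → ["B", "C"]: C is reached via B (1+2 < 5)
```

Replacing Euclidean with network distance changes which neighbors are "close", which is exactly why the Q statistic can give different association results in the two spaces.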
Objective In this paper, firstly, the deficiencies of traditional large-scale source submergence analysis are introduced. Secondly, considering application in a 3D virtual globe and based on the global terrain data organization method, a terrain-tile-based algorithm for source submergence analysis is proposed, and large-scale source submergence analysis is achieved in the 3D virtual globe. Finally, an experiment based on GeoGlobe is carried out according to the algorithm, and some suggestions are discussed.
Objective Taking the profiles of 3D geological modeling as the basis for intersection modeling between a tunnel and strata, this paper presents a method of intersection modeling between a tunnel and strata. Taking the complexity into account, the intersection model is subdivided: the model of strata profiles within the tunnel section, the intersection model between the tunnel and the strata profiles, and the intersection model between the tunnel and the strata are constructed respectively. It is shown that the intersection modeling method is feasible, ensures correct spatial topological relations, and provides a reference for related 3D geological modeling.
Objective This paper pays attention to measuring the geometric information content of individual linear features and proposes a method based on the bends of a line. Firstly, a method of identifying the bends of a linear feature based on inflection points is presented, by which the geometric structure of a line is divided into a set of bends. Then the information content of an individual line feature, based on its ordered spatial set, is measured at three levels, namely the element level, the local level, and the global level, which are measured based on bend shape, bend topology, and bend distribution, respectively. These three types of information form the entire geometric information of an individual line feature. Moreover, methods of measuring each type of geometric information are presented. Finally, some practical examples are provided to illustrate the proposed methods.
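The first step, identifying bends at inflection points, can be sketched as splitting a polyline wherever the turn direction changes sign. This is a minimal reading of the method; how the paper handles collinear vertices is not stated, so skipping zero turns below is an assumption:

```python
def cross_sign(p, q, r):
    # sign of the z-component of (q-p) × (r-q): the turn direction at q
    # (+1 left turn, -1 right turn, 0 collinear)
    c = (q[0] - p[0]) * (r[1] - q[1]) - (q[1] - p[1]) * (r[0] - q[0])
    return (c > 0) - (c < 0)

def split_into_bends(pts):
    # split a polyline at inflection points, i.e. interior vertices where
    # the turn direction changes sign; consecutive bends share that vertex
    if len(pts) < 3:
        return [pts]
    bends, start, prev = [], 0, 0
    for i in range(1, len(pts) - 1):
        s = cross_sign(pts[i - 1], pts[i], pts[i + 1])
        if s and prev and s != prev:
            bends.append(pts[start:i + 1])   # inflection vertex closes the bend
            start = i
        if s:
            prev = s
    bends.append(pts[start:])
    return bends

# an S-shaped line has one inflection and therefore two bends
line = [(0, 0), (1, 1), (2, 0), (3, -1), (4, 0)]
bends = split_into_bends(line)  # → two bends meeting at (3, -1)
```

Each resulting bend can then be measured for shape, and the ordered set of bends supports the topology- and distribution-level measures described above.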