2009 Vol. 34, No. 12
With the development of surveying and measurement technology, landslide monitoring is moving toward automatic, three-dimensional, and real-time operation, and the processing of monitoring data is becoming increasingly important. We study new methods for landslide deformation analysis and prediction, such as lifting wavelets, fuzzy clustering, and fuzzy neural networks, together with their applications, forming a relatively integrated framework and providing new research directions for landslide deformation analysis and prediction.
In positioning and attitude determination systems, tight and loose GPS/INS coupling each have advantages and disadvantages: loose coupling offers good reliability but low precision, while tight coupling offers high precision but poor reliability. Hence, it is desirable to design a Kalman filter that provides both high precision and good reliability. Based on a consistent INS error state equation, a multiplex filter is proposed with the two different observation equations of loose and tight coupling. In the multiplex filter, the coupling mode is selected dynamically according to the actual GPS observations in order to ensure both accuracy and reliability. To verify the effect of the multiplex filter, actual data were processed with the three coupling modes. The analysis and comparison of the results show that dynamic coupling with the multiplex Kalman filter has obvious advantages in both accuracy and reliability over either loose or tight coupling alone.
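The mode-switching idea behind such a multiplex filter can be sketched as follows. This is a minimal illustration, not the paper's actual filter: the two-state model, the satellite-count threshold, and all function names are assumptions made for the example.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: state x, covariance P,
    observation z with design matrix H and noise covariance R."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def multiplex_update(x, P, z_tight, z_loose, n_sats,
                     H_tight, R_tight, H_loose, R_loose, min_sats=4):
    """Hypothetical mode switch: apply the tight-coupling observation
    equation when enough satellites are tracked, otherwise fall back
    to the loose-coupling observation equation."""
    if n_sats >= min_sats:
        return kalman_update(x, P, z_tight, H_tight, R_tight)
    return kalman_update(x, P, z_loose, H_loose, R_loose)
```

Both branches share the same INS error state (x, P), which is what lets the filter change coupling mode epoch by epoch without reinitialization.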
We first discuss the constellation rotation errors of autonomous orbit determination for navigation satellites using only crosslink ranging observations. We then comprehensively analyze the influence of tidal perturbation on the right ascension of the ascending node and on autonomous orbit determination. Processing results from 110 days of simulated data indicate that tidal perturbation has a large impact on the along-track and cross-track errors of autonomous orbit determination, and that the impact grows as the autonomous orbit determination period lengthens. The along-track and cross-track differences attributable to tidal perturbation reach about 4 m after 110 days. It is therefore necessary to consider the influence of tidal perturbation in high-precision autonomous orbit determination.
We study the theory of the free wobble of the triaxial Earth and find that the Euler period should actually be expressed by the complete elliptic integral of the first kind, and that the trajectory of the free polar motion is elliptic, with the semi-minor and semi-major axes approximately parallel to the Earth's principal axes A and B, respectively. Due to the triaxiality of the Earth, there is a frequency-amplitude modulation mechanism in the Chandler wobble which might be a candidate to explain the correlation between the amplitude and the period of the Chandler wobble. We compare the theoretical polar-motion parameters (m1, m2) with the observed Chandler components obtained from the EOP (IERS) data and find that they agree quite well, especially in recent years. Moreover, a polar wander toward 77.98°W at a rate of 3.93 mas/a is also obtained from the IERS data.
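The abstract does not give the explicit Euler-period formula, but the complete elliptic integral of the first kind K(m) that it relies on can be evaluated with the arithmetic-geometric mean (AGM). This is a minimal numerical sketch; the function name and stopping tolerance are our own choices.

```python
import math

def ellipk(m):
    """Complete elliptic integral of the first kind, parameter m = k^2,
    via the AGM identity K(m) = pi / (2 * AGM(1, sqrt(1 - m)))."""
    a, b = 1.0, math.sqrt(1.0 - m)
    while abs(a - b) > 1e-15:            # AGM converges quadratically
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return math.pi / (2.0 * a)
```

For m = 0 this reduces to K = pi/2, the circular (biaxial) limit; a nonzero m, built from the Earth's principal moments of inertia, would stretch the period accordingly.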
An improved united ambiguity decorrelation algorithm is proposed by analyzing the deficiencies of the original united ambiguity decorrelation algorithm. The improved algorithm searches for the most precise ambiguity before each decorrelation step, and the conditional variances should be arranged approximately in descending order. The performance of the original and improved algorithms is compared using six symmetric positive-definite matrices derived from random simulations, for high and low dimensions respectively. The results show that the improved algorithm reduces the correlation of the variance-covariance matrix and performs better in decreasing the time of both the decorrelation process and the search process.
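The basic building block of ambiguity decorrelation, an integer Gauss transformation, can be sketched for the 2x2 case as follows. This is a generic LAMBDA-style step, not the paper's improved united algorithm; the function name and the example matrix in the usage are assumptions.

```python
import numpy as np

def decorrelate_2d(Q):
    """One integer Gauss transformation on a 2x2 ambiguity
    variance-covariance matrix Q: subtract an integer multiple of the
    second ambiguity from the first, shrinking the off-diagonal
    correlation while keeping the transformation integer
    (|det Z| = 1, so integer ambiguities stay integer)."""
    mu = round(Q[0, 1] / Q[1, 1])
    Z = np.array([[1.0, -mu], [0.0, 1.0]])
    return Z @ Q @ Z.T, Z
```

Repeating such steps (with reordering so the conditional variances descend) is what flattens the elongated ambiguity search space into a nearly spherical one.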
Correlated errors in GRACE coefficients were analyzed thoroughly based on GRACE Level-2 RL04 time-variable gravity data. A strategy for filtering the correlated errors was determined, and its effect was evaluated quantitatively by combining JASON-1 altimeter data and the WOA05 ocean model. The comparison shows that the filter reduces the correlated errors effectively and greatly improves GRACE's capacity to recover the short-wavelength components of Earth-surface mass variations.
For the mission of remotely sensing geophysical parameters with a GPS-reflection receiver on a LEO satellite, reflection events are simulated based on the simplified general perturbation-4 (SGP4) orbit prediction model, taking into account the satellite scan mode and the receiver antenna parameters. The impacts of the satellite orbit parameters (orbit inclination, orbit height, argument of perigee, and right ascension of the ascending node) on the distribution and number of ocean reflection events are discussed through simulation. Moreover, the total reflection events, including ocean and land reflections, derived from a single LEO satellite are simulated and analyzed. The results can serve as a reference when designing LEO satellite orbit parameters and receiver antenna parameters.
To satisfy the requirement of precise positioning and tracking of a flying vehicle (FV) during outages of external positioning references (such as GPS), or in sea areas where those references are denied, a cooperative positioning technique (CPT) for the FV is proposed. First, a time-of-arrival (TOA)-based cooperative positioning algorithm (CPA) is presented after building a station Cartesian coordinate system for cooperative positioning. Second, time synchronization among the nodes is crucial to guarantee the CPT. Taking a single-hop S2WSN as an example, the low synchronization precision caused by ignoring the unequal reply time in the conventional round-trip timing (RTT) method of the joint tactical information distribution system (JTIDS) is resolved by two-way timing with unequal reply time (TWT-UTD). Finally, a conceptual numerical simulation system is designed to verify the effectiveness of the proposed methods by Monte Carlo simulation over 1 000 runs. The simulation results show that the absolute bias of FV tracking with the proposed CPA is superior to that of the conventional single-ship relative positioning method, owing to the optimized position dilution of precision among the sea-surface nodes and the FV. Meanwhile, compared with the RTT method in JTIDS, the synchronization precision is increased by more than 20% with the TWT-UTD method, while the equivalent ranging error is reduced by 3 m.
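The core of two-way timing with an unequal reply time can be sketched with the standard four-timestamp exchange. This is a generic two-way time-transfer sketch, not the paper's full TWT-UTD protocol; the function name is an assumption.

```python
def twt_offset_delay(t1, t2, t3, t4):
    """Two-way timing with unequal reply time (sketch).
    Node A transmits at t1 (A's clock); node B receives at t2 and
    replies at t3 (B's clock); A receives the reply at t4 (A's clock).
    Because the reply time (t3 - t2) is measured and removed
    explicitly, it need not be a fixed constant."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # B's clock minus A's clock
    delay = ((t4 - t1) - (t3 - t2)) / 2.0    # one-way propagation delay
    return offset, delay
```

A fixed-reply-time RTT scheme implicitly assumes t3 - t2 is constant; measuring it is precisely what removes the synchronization bias when the reply time varies.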
A single point's displacement is usually chosen to analyze the impacts of all factors on a dam's displacement. Drawing on survival analysis for reference, we propose a method to find the law of dam displacement as a whole, in which all monitoring points whose displacements are affected by many factors are considered, so that the overall law can be found.
Based on the theory of potential and flexural isostatic compensation, the relationship between ocean depth and the vertical component of gravity gradient anomalies is derived. Using the vertical component of gravity gradient anomalies computed from Geosat, ERS-1/2, T/P, Jason-1, and EnviSat-1 altimeter data, an ocean depth model of the South China Sea is predicted with an FFT method. The predicted model is then compared with the LDEO ocean depth model; the RMS of the difference reaches 591.7 m where the ocean depth is greater than 5 000 m.
It is difficult to register LiDAR data and photogrammetric images automatically with traditional methods because of their different imaging modes. We present a novel method based on 6-tuple relaxation. First, the intersection of two edges perpendicular to each other is defined as a building corner. Then the rule that a triangle formed by three corners of the same height is similar to its projection is used to match corners with the 6-tuple relaxation. Finally, projection distortion is used to optimize the matching. Experimental results show that the proposed method registers LiDAR data and photogrammetric images automatically and precisely without interior or exterior orientation elements, and is robust to affine transformations and rotations.
We propose an efficient algorithm for the approximate epipolar rearrangement of satellite stereo imagery by linear simplification of PT, according to the geometric properties of epipolar curves within the scope of a satellite image scene. To verify the correctness and feasibility of the algorithm, experiments were performed on three kinds of satellite stereo imagery over various terrains. The results show that on the rearranged approximate epipolar images, the vertical parallax of each pair of conjugate points remains at the sub-pixel level. Moreover, the algorithm is easy to apply and well suited to linear pushbroom satellite stereo images. It has been successfully applied to the 3D data revision and automated DEM generation software modules for satellite linear pushbroom images.
Traditional linear models for generating natural-color SPOT images have difficulty producing highly realistic results because of their simplified description of the relationship between the image bands. To overcome this problem, we propose a new SPOT natural-color simulation method based on spectrum analysis. First, we take the image's spectrum as a reference to adjust the scales of the spectral-library spectra and form several spectral bands. Then we use a back-propagation artificial neural network (BP ANN) to learn and analyze the relationship between the spectral library's blue band and the other bands. Finally, we apply this relationship to simulate the SPOT blue band and generate the natural-color image. Experiments show that the resulting natural-color image is superior to those of other methods in both visual effect and spectral information.
Vegetation index (VI) data have been widely used in various studies of global environmental change. Although EOS provides several kinds of VI data derived from AQUA/MODIS, residual noise still exists because of cloud contamination, atmospheric variability, and other effects. Hence the VI data are often discontinuous in both space and time, which is the main source of subsequent errors. We reconstruct the MOD13A2-EVI time-series data from 2001 to 2007 with a Savitzky-Golay filter implemented in IDL. The reconstructed EVI data reduce the effects of cloud and other abnormal noise, making the data more consistent in space and more stable between years.
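The paper's implementation is in IDL; the Savitzky-Golay idea itself, fitting a low-order polynomial to each moving window by least squares and keeping its center value, can be sketched in Python (a minimal version with truncated windows at the edges; window and order values are illustrative).

```python
import numpy as np

def savgol(y, window=5, order=2):
    """Minimal Savitzky-Golay smoother: for each sample, fit a
    polynomial of the given order to the centred window by least
    squares and evaluate it at the centre.  Edge samples use
    truncated windows."""
    half = window // 2
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)
    for i in range(len(y)):
        lo = max(0, i - half)
        hi = min(len(y), i + half + 1)
        t = np.arange(lo, hi)
        coeffs = np.polyfit(t, y[lo:hi], order)   # local LS polynomial fit
        out[i] = np.polyval(coeffs, i)
    return out
```

Because the filter reproduces polynomials up to the chosen order exactly, it suppresses cloud-induced dropouts while preserving the shape and timing of the seasonal EVI peaks better than a plain moving average would.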
According to SVM computation theory and the features of hyperspectral remote sensing (RS) image data, the optimal hyperplane between two classes is computed by the nearest points algorithm (NPA). Reasonable weight indicators are designed for each class, and a new weighted "1 V m" SVM based on the NPA is proposed for hyperspectral RS image classification. The new algorithm reduces the computational complexity and workload of the SVM and improves its feasibility and efficiency for hyperspectral RS image classification. Finally, a test was carried out on an OMIS image and good results were obtained.
To address the problems of automatic traffic sign detection in natural scene images, an automatic traffic sign detection algorithm based on a vehicle-borne mobile mapping system is proposed. The algorithm combines techniques such as self-adaptive image segmentation, gray-value projection, shape analysis, and stereo image matching to detect traffic signs automatically and to compute their geometric information, such as spatial position and size. The algorithm was applied to actual natural-scene images taken by the mobile photogrammetry system in Nanjing at different times. The experimental results show that the algorithm is robust and achieves high detection speed and accuracy.
Remote sensing information services face complex user demands, rich data dimensions, diverse sensor types, complicated processing, and time-varying networks, all of which make their semantics intricate. We therefore propose a hierarchical semantic constraint model as a uniform semantic description model with four levels: user semantic constraints, data semantic constraints, processing-service function semantic constraints, and processing-service quality semantic constraints. These constraints establish the connection between user semantics on the one hand and data services and processing services on the other, and form the basis of semantic reasoning in service discovery, selection, and composition.
DEMs (digital elevation models) and DEM-based terrain analysis are scale-dependent. DEM scale transition, i.e., deriving other scales from a fixed-resolution DEM, is often required in applications such as hydrology, soil science, and geomorphology. We present a new DEM scale-transition method based on the point spread function used to model image blur. The method performs a spatial convolution between the fixed-resolution DEM and simulated point spread functions with different radii. Elevation statistics, contour matching, and autocorrelation are then used to evaluate the transformed DEM. Finally, the results of the proposed method were compared with those of nearest-neighbor assignment, bilinear interpolation, and cubic convolution in two different study areas. The results show that the proposed method is effective.
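The convolve-then-subsample step can be sketched as follows. This assumes a Gaussian-shaped point spread function and reflective edge handling, which the abstract does not specify; radius, sigma, and step values are illustrative.

```python
import numpy as np

def gaussian_psf(radius, sigma):
    """Simulated point spread function: a normalised 2-D Gaussian kernel."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def dem_scale_transition(dem, radius=2, sigma=1.0, step=2):
    """Convolve a fine-resolution DEM with the PSF, then subsample
    every `step` cells to obtain the coarser-scale DEM.  Edge cells
    are handled by reflecting the DEM at its borders."""
    pad = np.pad(np.asarray(dem, dtype=float), radius, mode="reflect")
    k = gaussian_psf(radius, sigma)
    n = 2 * radius + 1
    out = np.zeros((len(dem), len(dem[0])))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(pad[i:i + n, j:j + n] * k)   # local PSF average
    return out[::step, ::step]
```

Because the kernel is normalised, flat terrain and the overall elevation budget are preserved, which is what the elevation-statistics evaluation in the abstract checks.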
Entropy is a key index for measuring the terrain information content of a DEM. Because subset partition strategies influence DEM information content in uncertain ways, we present a new subset partition algorithm based on maximum entropy theory. First, a standard maximum-entropy response curve is constructed. Then the optimal classification strategy is analyzed and calculated from the differences between the standard logarithmic model and the original information entropy model, using the statistical characteristics of the variation of the linear slope and the stability of the curve. The new algorithm is shown to avoid the subjectivity and arbitrariness of manual DEM classification, providing an objective theoretical basis for calculating DEM information entropy.
The global error of a digital elevation model (DEM) comprises the propagation error of the grid data and the simulation error of the terrain represented by the interpolation method. The experimental results show that, over the same terrain, the differences between the global errors of different DEMs decrease gradually as the original data error increases, eventually approaching zero. Over different terrains, the DEM global error differs when the original data error differs, even with an identical interpolation method. As the experiments show, DEM precision is related not only to the sampling interval and terrain complexity but also to the interpolation method and the original data error, and different DEMs have different sensitivities to the original data error over different terrains. Hence, we present a way to select an appropriate interpolation method.
Recent advances in open geospatial web services, such as Web Map Services, have generated large numbers of OGC-enabled links on the Internet. Finding the correct spatially aware web services in a heterogeneous distributed environment under specific criteria, such as coincidence of type, version, time, space, and scale, has become a bottleneck for geospatial web applications. Meanwhile, OGC interoperability is only syntactic, not semantic. To improve the retrieval precision of OGC Web Map Services (WMS) on the Web and make retrieval semantic, a new methodology based on an extended search engine and service capability matching is proposed. Its major components are the WMS search engine, the WMS ontology generator, the WMS catalogue service, and a multi-protocol WMS client. The processes of WMS link detection, capability matching, ontology modeling, and automatic registry are reported, and the precision and response time of WMS retrieval are evaluated. Results show that the execution time of the proposed method equals that of the traditional method, while the precision is about 10 times higher, and WMS ontology records can be generated.
Traditional static route guidance algorithms cannot adapt to dynamic situations. A genetic algorithm (GA) and a geographical information system (GIS) were adopted to design a vehicle route guidance algorithm based on real-time traffic information. A dynamic route guidance algorithm based on the GA and GIS is proposed on the basis of a time-dependent dynamic traffic network. To realize the algorithm, a special fitness function and selection, crossover, and mutation operators suited to the characteristics of the dynamic traffic network were designed. The high efficiency of the algorithm was demonstrated by experiment.
Spatial heterogeneity widely exists in nature. Traditional spatial association data mining assumes that the area on which the mining algorithm operates is evenly distributed, which leads to a mismatch between the mined knowledge and reality. We suggest that spatial association mining should take this spatial heterogeneity into account when designing mining algorithms. The characteristics of spatial association mining were analyzed, three key indexes measuring spatial association strength were defined, and a method for calculating them was presented. An algorithm was proposed for mining spatially heterogeneous association patterns together with the subregions in which each pattern shows strong association. Practical application proved that the proposed strategy is valuable and effective for mining spatial association patterns in a spatially heterogeneous environment.
Based on an analysis of currently used spatial association rule algorithms, and aimed at the shortcomings of spatial association rule mining in very large databases, a spatial association rule mining algorithm based on immune algorithms is proposed. The algorithm makes use of the immune recognition mechanism, immune memory, and clonal selection. In the mining process, spatial association rules are regarded as antigens and candidate itemsets as antibodies. The spatial association rules are stored in memory, which accelerates the mining of spatial association rules. We take the incidence relation of pole-and-tower fault data as an example to verify the algorithm. Experimental results show that the proposed algorithm is effective: it searches the whole solution space more quickly and efficiently and is well suited to mining spatial association rules from very large databases.
Given two adjacent oblique-axis parabolas, we propose a method to construct a Hermite curve from the tangent vector at the vertex. This method avoids root extraction. In addition, substituting the direction determined by rational points for the above-mentioned direction is a viable way to improve performance. The magnitudes of the tangent vectors can be set to the projected length of the chord in the approximate method. While producing similar shapes, the two algorithms are more efficient than weighted-average interpolation.
According to the structural characteristics of curves, a new polygonal approximation algorithm based on corner detection is proposed by introducing a polarization cornerity index for the corner candidates. The algorithm is independent of the position of the start point. A comprehensive comparison between polygonal approximation approaches and corner detection approaches for curve reconstruction is carried out. The newly developed curve reconstruction algorithm was extensively tested on various shapes and proved to be computationally fast and robust to noise. The experimental results are stable and close to human visual perception.
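The polarization cornerity index itself is not defined in the abstract, but the polygonal approximation task it addresses has a classic baseline, the Douglas-Peucker algorithm, which can serve as a point of comparison (a standard sketch, not the paper's method).

```python
import math

def douglas_peucker(points, eps):
    """Classic Douglas-Peucker polygonal approximation: keep the point
    farthest from the chord if its distance exceeds eps, then recurse
    on the two halves; otherwise replace the run by the chord."""
    def dist(p, a, b):
        (ax, ay), (bx, by), (px, py) = a, b, p
        dx, dy = bx - ax, by - ay
        norm = math.hypot(dx, dy)
        if norm == 0:
            return math.hypot(px - ax, py - ay)
        return abs(dy * (px - ax) - dx * (py - ay)) / norm
    if len(points) < 3:
        return list(points)
    d = [dist(p, points[0], points[-1]) for p in points[1:-1]]
    imax = max(range(len(d)), key=d.__getitem__) + 1
    if d[imax - 1] <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:imax + 1], eps)
    right = douglas_peucker(points[imax:], eps)
    return left[:-1] + right   # drop the duplicated split point
```

Note that this baseline depends on the chosen endpoints of the chain, which is exactly the start-point dependence the cornerity-based algorithm is designed to avoid.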
A hybrid genetic algorithm combined with a greedy algorithm, together with various crossover operators, is applied to the four-coloring map problem. The influences of the four possible crossover operators on the algorithm are analyzed and compared. The results show that crossover with edge recombination has the best performance.
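The greedy component that such a hybrid combines with the GA can be sketched as follows (a standard largest-degree-first greedy coloring on the map's adjacency graph; the GA part and the crossover operators are not reproduced here).

```python
def greedy_coloring(adjacency):
    """Greedy map coloring: visit regions in order of decreasing
    number of neighbours and give each the smallest colour index not
    already used by a coloured neighbour."""
    colors = {}
    for node in sorted(adjacency, key=lambda n: -len(adjacency[n])):
        used = {colors[nb] for nb in adjacency[node] if nb in colors}
        c = 0
        while c in used:
            c += 1
        colors[node] = c
    return colors
```

Greedy coloring alone does not guarantee four colors on every planar map, which is why the hybrid lets the GA reorder and recombine candidate colorings around it.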
The advantage of exploratory data analysis (EDA) is that it can reveal the simple, inherent characteristics and regularities within data from multiple views, through human-computer interaction linking various visualization methods, without any prior hypothesis. We aim to explore the potential of EDA in land-use data analysis. EDA is used to analyze the distribution characteristics of land use, the associations among different land-use factors, land-use change, and other land-use-related aspects. The test results prove that EDA can effectively explore the deep information contained in land-use data from different levels and views. This information covers many aspects: from the whole to the individual, from detailed to generalized classification, from general to specific characteristics, and from static to dynamic characteristics.