1999 Vol. 24, No. 2
A new modelling method for evaluating the external disturbing potential for a special application is presented in this paper. Unlike classical methods in physical geodesy, this method is grounded in the theory of unified representation of the gravitational field. The models created in this way are particularly suitable for high-speed computation of the gravitational field at low altitude, because they take account of topographic effects and their kernel functions have a simple structure and weak singularity.
In this paper the authors study the influence of geodetic singularity on geodesy and the structure of the gravity field near a singular point, expose the nature of the singularity, and draw the following conclusions: first, isolated geodetic singular points are removable; second, under the condition that the Gauss curvature of the equipotential surface is non-negative and the geodetic torsion of the meridian vanishes, if geodetic singular points exist on an equipotential surface of the earth's gravity field, their number is finite and they are isolated.
In view of the inhomogeneous distribution of altimeter data over the ocean, the authors propose two weighting methods for altimeter data processing, one for crossover points and the other for non-crossover points. Tests show that the proposed methods outperform those previously suggested by other authors.
The national high-precision GPS network for geodetic control was surveyed from 1991 to 1997. The GPS observations were processed with the GAMIT software, and the GAMIT baseline solutions were used as observations in the integrated adjustment. In this case outlying baselines are very difficult to detect among the 4 935 independent baselines. This paper applies correlation analysis theory to snoop for outliers: in the adjustment the outliers are retained but down-weighted. The theory can detect multiple outliers in correlated observations and reduce their influence. The paper also gives the steps and principles of this outlier analysis, and compares the results of the national high-precision GPS network adjustment with and without it.
Kalman filtering is a method for cleaning noisy raw data. The standard Kalman filter can be extended to nonlinear models: in the extended Kalman filter (EKF), a first-order Taylor approximation is applied to both the state equations and the measurement equations in order to estimate linearized dynamic models. If the initial value is poor or the noise is strong, however, the linearized models may no longer be adequate. The iterated extended Kalman filter (IEKF) has therefore been applied to GPS raw data processing, and the results are satisfactory.
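The IEKF measurement update described above relinearizes about the latest iterate rather than only about the prediction. A minimal sketch follows; the range-only positioning setup, beacon coordinates, starting value and noise levels are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def iekf_update(x_pred, P_pred, z, h, H_jac, R, n_iter=10):
    """One iterated EKF measurement update.

    Relinearizes the measurement model h() about the latest
    estimate instead of only about the prediction, which helps
    when the initial value is poor or the noise is strong.
    """
    x_i = x_pred.copy()
    for _ in range(n_iter):
        H = H_jac(x_i)                       # Jacobian at current iterate
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        # IEKF iterate: residual at x_i, prior pull back toward x_pred
        x_i = x_pred + K @ (z - h(x_i) - H @ (x_pred - x_i))
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_i, P_upd

# Toy range-only fix: estimate a 2-D position from distances to
# three hypothetical beacons (stand-ins for satellite ranges).
beacons = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
truth = np.array([30.0, 40.0])

def h(x):
    return np.linalg.norm(beacons - x, axis=1)

def H_jac(x):
    return (x - beacons) / h(x)[:, None]

z = h(truth)                    # noise-free ranges, for clarity
x0 = np.array([80.0, 90.0])     # deliberately poor initial value
P0 = np.eye(2) * 1e4            # nearly uninformative prior
R = np.eye(3) * 0.01
x_est, _ = iekf_update(x0, P0, z, h, H_jac, R)
```

With a nearly uninformative prior the iteration behaves like Gauss-Newton on the range residuals, which is why it recovers from an initial value a plain EKF update would handle poorly.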
In this paper the author presents a methodology for automatically determining observation weights in GPS-supported bundle block adjustment. During the iterative computation, the observation weights are corrected automatically by posterior variance estimation, based on a fast recursive algorithm for the repeated computation of the reliability matrix Q_vv·P. In any case, the estimated weights of the additional parameters' observations and the drift error parameters' observations are comparatively exact, whereas the weights of the ground control point observations, the GPS camera position observations and the GPS offset observations cannot be estimated exactly. Experiments show that stable and reliable adjustment results are obtained even when wholly different initial weights are chosen, and that the additional computational cost is acceptable.
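The core of posterior variance estimation can be sketched for a single linear model; the per-observation (rather than grouped) variance update and the line-fitting example below are simplifying assumptions for illustration, not the paper's photogrammetric formulation:

```python
import numpy as np

def posterior_variance_weights(A, l, p0, n_iter=10):
    """Iterative reweighting by posterior variance estimation.

    After each least-squares pass, each observation's variance is
    re-estimated from its residual v_i and its redundancy number
    r_i = (Q_vv P)_ii, and the weight is updated accordingly.
    """
    p = p0.copy()
    for _ in range(n_iter):
        P = np.diag(p)
        N = A.T @ P @ A
        x = np.linalg.solve(N, A.T @ P @ l)
        v = A @ x - l                          # residuals
        Qvv = np.diag(1.0 / p) - A @ np.linalg.inv(N) @ A.T
        r = np.diag(Qvv @ P)                   # redundancy numbers
        s2 = v**2 / np.maximum(r, 1e-12)       # posterior variance per obs
        p = 1.0 / np.maximum(s2, 1e-12)        # updated weight per obs
    return x, p

# Toy example: fit y = a*x + b where one observation carries a
# gross error; its weight should be driven down automatically.
xs = np.arange(6.0)
ys = 2.0 * xs + 1.0
ys[5] += 10.0                                  # gross error
A = np.column_stack([xs, np.ones_like(xs)])
est, w = posterior_variance_weights(A, ys, np.ones(6))
```

The outlying observation ends up with the smallest weight, while the estimated slope and intercept converge to the values defined by the consistent observations.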
It is often necessary to set up a control network of extra-high accuracy around industrial objects. This paper presents the theory and practice of establishing such a control network, in which the length accuracy of a precise standard rule and the angle accuracy of a precise theodolite are used simultaneously. The method can also be used for accuracy testing of a group of GPS receivers.
This paper gives a systematic description of the accuracy of the airborne laser-ranging multispectral-imaging mapping system (ALRIMS). This integrated system, driven by the availability of the global positioning system (GPS), compact ruggedized solid-state lasers, high-precision airborne inertial navigation systems and a rugged precise high-speed multispectral scanner, was developed to capture topographic and multispectral information of the earth's surface in the form of georeferenced multispectral imagery with a digital elevation model (DEM). The DEM and the georeferenced multispectral imagery are accurately matched at the time of capture. This is essential for many remote sensing purposes, such as geometric and radiometric rectification of the multispectral image and the use of auxiliary information for multispectral image classification. It is achieved with the laser-ranging multispectral-imaging coupled scanner (LRMICS) developed by our team at IRSA.
The State Bureau of Surveying and Mapping of China plans to speed up the development of its spatial data infrastructure (SDI) in the coming years. This SDI consists of four types of digital products, i.e. digital orthophotos, digital elevation models, digital line graphs and digital raster graphs. For the DEM component, a scheme for building and updating a database of 1:10 000 digital elevation models has been proposed and some experimental tests have been carried out. This paper describes the theoretical and technical background and reports some of the experimental results that support the scheme. Various aspects of the scheme are discussed, such as accuracy, data sources, data sampling, spatial resolution, terrain modeling and data organization.
According to their position in the data flow graph of a GIS, geographic data are classified into the following kinds: first data, database data, middle data and final data. An index for measuring data quality is attached to each data "atom" in the GIS, and geographic data quality models at the different steps in the GIS are proposed. Analysis of the models indicates that the quality of the first data plays the most important role in GIS products. The factors affecting data quality in GIS and three ways of quality control are summarized.
In this paper the idea of and an algorithm for automatically locating the best viewpoint are introduced. In view of the complexity of actual applications, the authors suggest extracting knowledge from GIS to assist in solving more complicated problems. A case study of television tower positioning shows that the GIS-based positioning model proposed in this paper is correct and efficient in most cases.
A practical numerical algorithm for calculating the error of any point on a curve in GIS is introduced, and the main approaches to determining the error band of an arbitrary curve are given. As an example, the error-band model of a cubic spline is calculated.
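Since a spline point is a linear combination of its nodes, the error of any point on the curve follows from plain variance propagation through the basis functions. A minimal sketch, using a cubic Bezier segment with independent isotropic node errors as a stand-in for the paper's cubic-spline model (the node accuracies are invented for illustration):

```python
import numpy as np
from math import comb

def bernstein(i, n, t):
    """Bernstein basis polynomial b_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def curve_point_sigma(t, sigmas):
    """Standard error of a point on a cubic Bezier curve whose
    four control points have independent standard errors.

    Because B(t) = sum_i b_i(t) * P_i is linear in the control
    points, variance propagation gives
        sigma^2(t) = sum_i b_i(t)^2 * sigma_i^2.
    A cubic-spline segment propagates node errors the same way
    through its own basis functions.
    """
    b = np.array([bernstein(i, 3, t) for i in range(4)])
    return float(np.sqrt(np.sum((b * sigmas) ** 2)))

# Error band: the band half-width at parameter t is k * sigma(t)
# for a chosen confidence factor k.
sig = np.array([0.05, 0.05, 0.05, 0.05])   # node std. errors (m), assumed
band = [curve_point_sigma(t, sig) for t in np.linspace(0.0, 1.0, 11)]
```

Note that mid-curve errors come out smaller than the node errors, since the basis weights average several nodes.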
Many models in geodesy are nonlinear. The classical way of dealing with them is linear approximation about approximate parameter values. Because the degree of nonlinearity differs between models, some nonlinear models can be linearized in this way and others cannot, and a criterion for judging whether a given model can be linearized has been presented in the literature. Building on that work, this paper first defines the intrinsic nonlinearity and the parameter-effects nonlinearity of a nonlinear model, and then presents a new practical criterion based on these definitions. The new criterion retains the characteristics of the earlier one while overcoming its defects, and exhibits several new properties in comparison.
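The question the criterion answers can be illustrated numerically: how far does the model depart from its first-order Taylor approximation over a plausible parameter step? The check below is only an illustrative diagnostic under an assumed exponential model, not the paper's formal criterion:

```python
import numpy as np

def linearization_gap(f, jac, theta0, delta):
    """Relative gap between a nonlinear model and its first-order
    Taylor approximation at theta0, over a parameter step delta.

    The gap is measured against the size of the first-order change
    itself; a large value signals that the model should not simply
    be linearized at that approximate value.
    """
    exact = f(theta0 + delta)
    linear = f(theta0) + jac(theta0) @ delta
    return np.linalg.norm(exact - linear) / np.linalg.norm(exact - f(theta0))

# Assumed example model: y(t) = a * exp(b * t).
t = np.linspace(0.0, 1.0, 5)
f = lambda th: th[0] * np.exp(th[1] * t)
jac = lambda th: np.column_stack([np.exp(th[1] * t),
                                  th[0] * t * np.exp(th[1] * t)])

small = linearization_gap(f, jac, np.array([1.0, 1.0]), np.array([0.01, 0.01]))
large = linearization_gap(f, jac, np.array([1.0, 1.0]), np.array([0.5, 0.5]))
```

For a small step the relative gap is tiny and linearization is harmless; for a large step it grows, which is the situation the criterion is designed to flag.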
Automatic map generalization is a complex problem that involves a great deal of data analysis, and knowledge is needed about data models, operations and their relations. It is therefore necessary to analyze the decomposition of automatic map generalization and its set of operators in order to solve this complex problem. In this paper, with respect to the essential characteristics of the objectives of map generalization and its own rules, the problem of map generalization is decomposed and, based on an analysis of existing operator sets, a complete operator set for map generalization is given. The relationships and ranking of the operators are also discussed.
Based on an analysis of the differences between placing map names manually and automatically, this paper discusses the primary factors that should be taken into account in automatic name placement. Some principles for automated name placement and an approach to its implementation are then presented.
This paper describes the design of a system based on a single-scale map database that is used for multi-scale display and query. Map generalization is a very difficult topic, and it is also an unavoidable problem for multi-scale display in geographic information systems. The author has studied a practical model in detail and puts forward a new model design.
Variable-scale map projection is a recent development in map projection research and has been widely applied. This paper studies and puts forward the theory and methods of composite projection, which is based on map projections in common use, and obtains a new kind of variable-scale projection. The method opens up new functions and application ranges for common map projections. Systematic results and applications have been obtained.
This paper integrates cellular automata (CA) with grey situation decision transition rules to simulate land use conversion in Hongshan District, a rural-urban fringe of Wuhan City, and obtains satisfactory results. They show that the model can efficiently simulate land use conversion at both the microscopic and the macroscopic level.
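The CA mechanics can be sketched with a deliberately simplified transition rule; the plain neighbour-count-plus-suitability threshold below is a stand-in for the grey situation decision rules of the paper, and the grid, seed and parameter values are invented for illustration:

```python
import numpy as np

def ca_step(grid, suitability, n_min=2, s_min=0.5):
    """One cellular-automata step for land-use conversion.

    A non-urban cell (0) converts to urban (1) when at least n_min
    of its 8 neighbours are urban and its suitability score is at
    least s_min. This threshold rule stands in for the grey
    situation decision rules used in the paper.
    """
    urban = (grid == 1).astype(int)
    # 8-neighbour count via shifted views of the zero-padded grid
    p = np.pad(urban, 1)
    nbrs = sum(p[1 + di:1 + di + grid.shape[0], 1 + dj:1 + dj + grid.shape[1]]
               for di in (-1, 0, 1) for dj in (-1, 0, 1)
               if (di, dj) != (0, 0))
    convert = (grid == 0) & (nbrs >= n_min) & (suitability >= s_min)
    out = grid.copy()
    out[convert] = 1
    return out

# Toy run: a 5x5 fringe with a small urban core growing outward.
grid = np.zeros((5, 5), dtype=int)
grid[1:3, 1:3] = 1              # seed urban core
suit = np.ones((5, 5))          # uniformly suitable, for simplicity
for _ in range(3):
    grid = ca_step(grid, suit)
```

Each sweep applies the same local rule to every cell, which is what lets a CA produce macroscopic conversion patterns from microscopic decisions.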
Based on a simple mechanical model of large dam deformation, the cusp catastrophe model of large dam destabilization is established by means of catastrophe theory, taking into account nonlinear strain and the water-permeation weakening of cracks in the dam. The mechanism of the deformation and destabilization process of a large dam, as well as its force condition, is analyzed qualitatively. The discussion presented in this paper may be helpful for a full understanding of the mechanism of large dam destabilization.
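For reference, a cusp catastrophe model rests on the standard cusp normal form; which physical quantities of the dam play the roles of the state variable x and the control parameters a and b is the paper's contribution and is left abstract here:

```latex
V(x) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}\,a\,x^{2} + b\,x ,
\qquad
\frac{\partial V}{\partial x} = x^{3} + a\,x + b = 0 ,
\qquad
\Delta = 4a^{3} + 27b^{2} .
```

Equilibrium states lie on the surface x^3 + ax + b = 0; when the control point (a, b) crosses the bifurcation set Delta = 0, the equilibrium jumps discontinuously, which is the catastrophe-theoretic picture of sudden destabilization.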
This paper analyzes the data features of the fuzzy relational database (FRDB) on the basis of fuzzy set theory and gives a formalized description of the fuzzy relation. The problems resulting from extending the attribute domain from binary logic to fuzzy sets are examined, and data integrity for the FRDB is studied in particular.
The method of testing the horizontal error of the beam of a laser swinger is expounded in this paper, including the horizontal error of the beam in characteristic directions, the testing steps and the method of data processing. Test results are given.
For the typical three-dimensional flow-speed-density model of traffic, establishing the flow-density and flow-speed relation models from different speed-density relationships is more understandable and acceptable, as well as safer and more reliable. The model thus provides a better foundation for traffic analysis, control and management.
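The derivation path described here, from a speed-density relationship to the flow relations via the identity q = k·v, can be sketched with the classical Greenshields linear model; choosing Greenshields (rather than the paper's specific relationships) and the free-flow speed and jam density values are assumptions for illustration:

```python
def greenshields_flow(k, v_f=60.0, k_j=120.0):
    """Flow from density under the Greenshields model.

    The linear speed-density relation v(k) = v_f * (1 - k / k_j),
    combined with the identity q = k * v, yields the parabolic
    flow-density relation. v_f is the free-flow speed (km/h) and
    k_j the jam density (veh/km); both values are illustrative.
    """
    v = v_f * (1.0 - k / k_j)
    return k * v

# Capacity occurs at half the jam density: q_max = v_f * k_j / 4.
q_max = greenshields_flow(120.0 / 2.0)
```

Flow vanishes at both zero density and jam density and peaks in between, which is the qualitative shape any speed-density relationship must reproduce in the flow-density plane.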