The full guide to using Calc/Solve is here. The following information describes the process used at Hobart and is aimed at routine processing of post-correlation data.

Prepare level 2/3 databases

Level 2 databases have an updated clock model applied (using calc11), replacing the model used by the correlator. To generate new databases, edit the up11.inputs file with the new database names (note the leading $):

$14FEB05XT
$14FEB05ST

and then run up11 up11.inputs AU. After completion, you should have 14FEB05XT_V002 and 14FEB05ST_V002 in /data/vlbi/mark3_dbh.
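The edit can be scripted; a minimal sketch, assuming up11.inputs holds only the $-prefixed database list (if your local file carries other settings, edit it by hand instead):

```shell
# Write the new database names (note the leading $) into up11.inputs.
# Assumption: the file contains only the database list.
printf '$14FEB05XT\n$14FEB05ST\n' > up11.inputs

# Then run the level-2 update (applies the calc11 clock model);
# commented out here because up11 only exists on a Calc/Solve machine:
# up11 up11.inputs AU
```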

Level 3 databases have cable cal and MET data added to provide an a priori atmosphere model. This is usually applied only to the X-band data. First, download the station log files with wget http://lupus.gsfc.nasa.gov/ivs/ivsdata/aux/2014/aust19/aust19hb.log, etc. Then run pwxcb aust19hb.log. You will be prompted to confirm some details of the experiment and, optionally, to edit the recorded MET data. When running pwxcb for the first log of an experiment, you will also be prompted for the name of the database that the experiment is linked to - use the same syntax as in the up11 file (e.g. $14FEB05XT). Repeat pwxcb for all stations in the experiment, then apply the calibrations by running dbcal /data/vlbi/wxcb/aust19.dbcal. After completion, you should have a 14FEB05XT_V003 file in /data/vlbi/mark3_dbh.
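The download-and-process cycle can be looped over the stations; a sketch, where the station codes (hb, ke, yg) are only examples - substitute the stations actually in your experiment (pwxcb is interactive, so it and wget are left commented):

```shell
EXP=aust19
YEAR=2014
for st in hb ke yg; do    # example station codes, not a definitive list
  url="http://lupus.gsfc.nasa.gov/ivs/ivsdata/aux/${YEAR}/${EXP}/${EXP}${st}.log"
  echo "$url"
  # wget "$url"                   # fetch the station log
  # pwxcb "${EXP}${st}.log"       # interactive: answer the prompts
done
# dbcal /data/vlbi/wxcb/${EXP}.dbcal   # finally, apply the calibrations
```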

Start calc/solve & get a viable solution

First, start Calc/Solve with the command enter AU. Once inside the Calc/Solve system, be extremely careful with your typing, as almost any keypress can and will be quietly interpreted as changing various settings. Also, default settings may not be displayed, or may be displayed incorrectly, on start-up. One last warning: Calc/Solve always starts up with the same data & settings loaded as when it was last run. To restart an analysis you should re-load the data, overwriting any previous databases in memory.

Polar motion and UT1: Polynomial Parameterization         SETFL Ver. 2007.07.30
14/02/05 19:00 XWOB Coefficients   0 0 0 0
14/02/05 19:00 YWOB Coefficients   0 0 0 0
14/02/05 19:00 UT1  Coefficients   0 0 0 0
Select:(/)G.Rate & Segments (%)Only Segments (|)Sine Style (@)Reset Poly Epoch
Gamma, Precession rate             0 0    Nutation(.): Dpsi, Deps  0 0

Print residu(A)ls: OFF             Print corr. (M)atrix: OFF
Print (N)ormal Matrix: OFF         (Z)ero Normal Matrix: OFF

(^)Elev. cutoff: None              Pick parameters: (!)Sites OFF  (#)UT1/PM

Wea(K) Station Constraints: OFF    (R)Use rate: Yes

Use normally (W)eighted delays     Select: Baseline-(C)lock offsets

(:)Delay  Group  (;)Rates Off      Select: (B)aselines, (X)Data bases

Page: (E)Site       (S)ource       (O)ptions            (")Constraints
      (Q)Run least squares         (T)erminate SOLVE    (<)A priori clock
      (+)Change data type          Group delay only
      (')Change suppression method SUPMET__PRE98        (-) Singularity check

 Last page   Parms used / Max parms available:   20/ 2000

Then press E to set up the site parameters. For the initial solution, set the Clock Polynomials line to 1 1 1 0 0 * * * for every station bar one (the reference station - choose the same one as was used as a reference in fourfit). You can change between stations with the N and P keys. Make sure that all other parameter estimations are set to zero for all stations & beware of errant keypresses. Once ready, you can generate a solution by pressing Q.

The solution will be written to the screen. Check that the weighted RMS in delay is < 1 microsecond (a larger value suggests strong outliers or systematic problems). Also check the second page, where the clock solutions are listed:

    1. HART15M  CL 0 14/02/05 06:59                -7063.162 ns      1.95 ns
    2. HART15M  CL 1 14/02/05 06:59                   76.263 D-14    9.44 D-14
    3. HART15M  CL 2 14/02/05 06:59                    14.85 D-14/d  9.13 D-14/d
    4. HART15M  NG 14/02/05 06:59                  -44.17 mm         48.37 mm
    5. HART15M  EG 14/02/05 06:59                 -140.18 mm         41.28 mm
    6. HART15M  NG 14/02/06 06:59                  138.50 mm         46.24 mm
    7. HART15M  EG 14/02/06 06:59                 -151.57 mm         36.44 mm

If CL 0 is greater than 100000 ns (100 microsec), or CL 1 is greater than 100000 D-14 in rate, then you will need to apply an a priori clock model or, better yet, recorrelate the data. See the Solve guide for instructions on applying an a priori model. If the solution is OK, return to the main menu with O.
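If you spool the solution listing to a file, the check can be automated; a sketch, assuming the column layout of the listing above (spool.txt is a placeholder name, and the sample lines are copied in so the snippet is self-contained):

```shell
# Sample CL lines in the format of the solution listing (placeholder file).
cat > spool.txt <<'EOF'
    1. HART15M  CL 0 14/02/05 06:59                -7063.162 ns      1.95 ns
    2. HART15M  CL 1 14/02/05 06:59                   76.263 D-14    9.44 D-14
EOF

# Flag CL 0 offsets (ns) or CL 1 rates (D-14) whose magnitude exceeds
# 100000 - either calls for an a priori clock model or recorrelation.
awk '$3 == "CL" && ($4 == 0 || $4 == 1) {
    v = $7; if (v < 0) v = -v
    if (v > 100000) print "a priori clock needed:", $2, "CL", $4
}' spool.txt
```

With the sample values above, nothing is flagged.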

 GAMB  S-band: r.m.s. of whole solution is OK
 GAMB  $b$ group ambiguities are resolved, but solution looks bad 
Are you sure, that you really need to save these results in scratch file ?

These are usually caused by a few bad sources/observations - press S to save the results and return to the main window.

 Clock polynomials
 14/02/05 06:59                1 1 1 0 0 * * *
 14/02/05 13:30                1 1 1 0 0 * * *
 14/02/05 19:12                1 1 1 0 0 * * *
Automatic outliers elimination utility                    ELIM  Ver. 2007.08.01
$14FEB05XT <3>
'SUPMET__COMB1'
Information about residuals is not available yet

(X) Maximum uncertainty: 1000. psec        (A) Acceleration factor: 1

(U) Upper threshold for outlier detection: 400. psec      (E) EQM speed-up: No

(C) Cutoff limit for outlier detection:    not specified  (Y) Type: baseline

(Q) Quality code limit: 5                  (D) Update residuals

(-) Singularity check                      (') Change suppression method

(V) Verbosity level:    1                  (N) Confirm each action: no

(S) Return to Optin and save results       (O) Return to Optin without saving

(P) Proceed for outliers elimination       (T) Toggle elimination/restoration

(W) Weights update                         (H) On-line help

Prepare level 4 database & convert to NGS format

You can do this at any stage after running GAMB, which handles the ambiguity resolution & ionospheric correction process. It's a two-stage process: creating an updated database & then running a DOS script to convert it to NGS format.

Database update                                           NEWDB Ver. 2007.06.05
-------------------------------------------------------------------------------


Database to be updated: $14FEB05XT

        Reweighting: (G)roup (P)hase (B)oth (#)None

 (1) Clk & atm parms, constraints, data configuration-->Yes No

 (2) Group delay editing and ambiguities--------------->Yes No

 (3) Group ionosphere calibration:--------------------->Yes No

 (4) Met., cable, phase cal status:-------------------->Yes No

 (5) Ocean, relativity, pole tide status:-------------->Yes No

 (6) Phase delay editing and ambiguities--------------->Yes No

 (7) Phase ionosphere calibration:--------------------->Yes No

(N)ext Menu             (O)PTION    (R)efresh Screen  (D)efault standard
(T)erminate SOLVE       Re(S)elect Databases

If the update fails with errors such as “key not found”, this is usually due to the database having already been updated. If this happens, you'll need to exit Calc/Solve, delete the offending file from /data/vlbi/mark3_dbh/ and remove it from the catalogue. You can do this with the catlg program: use the de command (with -1 as the “password”), select e for entry and then confirm the deletion of the most recent version. Once deleted, you should be able to run the update as normal.
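A sketch of the cleanup, where the version number V004 is only an example - delete whichever version the failed update actually created (the catlg steps are interactive, so they are shown as comments):

```shell
# Remove the offending database file (example version number).
DB=/data/vlbi/mark3_dbh/14FEB05XT_V004
rm -f "$DB"

# Then remove the matching catalogue entry interactively:
#   catlg
#   de        (enter -1 as the "password")
#   e         (select entry)
#   confirm deletion of the most recent version
```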