Calc/Solve is currently only installed on ops7 in the AuScope control room. To create a database from the fourfit output, you should first have the /data/ drive on magilla NFS-mounted (''sudo mount magilla:/data/ /mnt/magilla/''). You should then create a dbedit input file with the following syntax:

<code>
$Input_Filter
  Duplicate_elim H S
  Sort T
  Time_Window earliest latest
  Username -AU-
  Aprioris NO
  UT1-PM NO
  Ephemeris NO
$Input_Source
  Mark4_Directory /mnt/magilla/AUSTRAL/aust19/1259/
$Substitutes
  Star '3C274'='M84'
$Output_Database
  Database N $14FEB05XA
  History AUST19 -AU-
  Band X
  Time_Window earliest latest
  db_format DBH
$Output_Database
  Database N $14FEB05SA
  History AUST19 -AU-
  Band S
  Time_Window earliest latest
  db_format DBH
$EOF
</code>

You can find previous experiments' files in the home area as ''aust19.dbedit'', etc. If you are making a test database to check the data quality, it is recommended to set the database name with "T" as the final letter. This saves some difficulty if you encounter problems with the data and end up having to reprocess it for the final release. The dbedit options are described in some detail [[http://lacerta.gsfc.nasa.gov/mk5/help/dbedit_02.html#section2.1|here]]. The ''$Substitutes'' section is used in this case to rename a source so that it matches the name used in the Calc/Solve database.

Once the aust19.dbedit file is ready, you can run dbedit with ''dbedit aust19.dbedit -h /mnt/magilla/AUSTRAL/aust19/1259/aust19.corr.perl''. The ''-h'' switch appends the correlation report to the database as a history entry. If dbedit completes successfully, you should have two newly created databases in /data/vlbi/mark3_dbh/. If the data have already been verified, these databases are ready for submission to the IVS FTP server (there is a script in ~observer called DBsubmit which should put the files in the correct places). Make sure that the correlation report is emailed off together with the database submission.
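
For reference, the end-to-end command sequence for this example experiment (aust19) looks roughly like the sketch below. The paths and file names are the ones used above and will differ for other experiments; the exact way DBsubmit is invoked from ~observer is shown only as a guess.

<code bash>
# Mount the correlator data area from magilla (if not already mounted)
sudo mount magilla:/data/ /mnt/magilla/

# Run dbedit on the control file; -h appends the correlation report
# to the new databases as a history entry
dbedit aust19.dbedit -h /mnt/magilla/AUSTRAL/aust19/1259/aust19.corr.perl

# Check that the X- and S-band databases were written
ls -l /data/vlbi/mark3_dbh/

# Once the data are verified, stage the databases for the IVS FTP server
# (DBsubmit is the script in the observer home area; invocation is assumed)
~observer/DBsubmit
</code>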