The VLT Science Archive System

M.A. Albrecht, E. Angeloni, A. Brighton, J. Girvan, F. Sogni, A.J. Wicenec, H. Ziaeepour (ESO)

The ESO Very Large Telescope (VLT) will deliver a Science Archive of astronomical observations exceeding the 100 Terabyte mark within its first five years of operations.

In order to safely store these data and subsequently maximize their scientific return, ESO is undertaking the design and development of both On-Line and Off-Line Archive Facilities. The main objective of these facilities is to provide the infrastructure needed to offer the Science Archive as an additional instrument of the VLT. The main capabilities of the system will be a) handling of very large data volumes, b) routine computer-aided feature extraction from raw data, c) a data mining environment operating on both data and extracted parameters, and d) an Archive Research Programme to support user-defined projects.

This talk reviews the current planning and development state of the VLT Science Archive project.

Astronomy On-Line - the world's biggest astronomy event on the World-Wide-Web

R. Albrecht (ESO/ESA/ST-ECF), R. West (ESO), C. Madsen (ESO)

This educational programme was organised as a collaboration between ESO, the European Association for Astronomy Education (EAAE) and the European Union (EU) during the 4th European Week for Scientific and Technological Culture. Astronomy On-Line brings together thousands of students from all over Europe and other continents. Learning to use the vast resources of tomorrow's communication technology, they also experience the excitement of real-time scientific adventure and the virtues of international collaboration. The central web site is hosted by ESO, and there are satellite sites in all participating countries. Astronomy On-Line features an electronic newspaper which reports on current astronomical events and provides hotlinks to appropriate sites. The "Marketplace" provides a gateway to collaborative projects, astronomical data and software, and to professional astronomers in the different participating countries who have agreed to support the project.

TAKO: Astro-E's Mission Independent Scheduling Suite

A. Antunes (HSTX/GSFC), P. Hilton (Hughes/ISAS), A. Saunders (GSFC)

The next generation of Mission Scheduling software will be cheaper, easier to customize for a mission, and faster than current planning systems. TAKO ("Timeline Assembler, Keyword Oriented", or in Japanese, "octopus") is our in-progress suite of software that takes database input and produces mission timelines. Our approach uses openly available hardware, software, and compilers, and applies current scheduling and N-body methods to reduce the scope of the problem. A flexible set of keywords lets the user define mission-wide and individual target constraints, and alter them on the fly. Our goal is that TAKO will be easily adapted for many missions, and will be usable with a minimum of training. The especially pertinent deadline of Astro-E's launch motivates us to convert theory into software within 2 years. The design choices, methods for reducing the data and providing flexibility, and steps to get TAKO up and running for any mission are discussed herein.

Suggested presentation: Demo

European Southern Observatory - MIDAS + SKYCAT

K. Banse, M. Albrecht (ESO)

The latest version of On-line Midas as used for ESO's Dataflow System will be shown. The Archive group will demonstrate SKYCAT, the catalog display tool which is based on ESO's Real Time Display (RTD).

P.S. Also, the 96NOV Midas CD-ROM should be ready.

Invited talk

Parkes Multibeam realtime object-oriented data reduction using AIPS++

D. Barnes (University of Melbourne), L. Staveley-Smith, T. Ye, T. Oosterloo (Australia Telescope National Facility (ATNF))

We present algorithms and their implementation details for the Australia Telescope National Facility (ATNF) Parkes Multibeam Software. The new thirteen-beam Parkes 21 cm Multibeam Receiver is being used for the neutral hydrogen (HI) Parkes All Sky Survey (HIPASS). This survey will search the entire southern sky for neutral hydrogen in the velocity range -1200 km/s to +12600 km/s, with a limiting column density of approximately 5 x 10^{17} atoms per square centimetre. Observations for the survey began late in February 1997, and will continue through to the year 2000.

A complete reduction package for the HIPASS survey data has been developed, based on the AIPS++ library. The major software component is realtime, and uses advanced inter-process communication coupled to a graphical user interface (GUI), provided by AIPS++, to apply bandpass removal, flux calibration, velocity frame conversion and spectral smoothing to 26 spectra of 1024 channels each, every five seconds. AIPS++ connections have been added to ATNF-developed visualization software to provide on-line visual monitoring of the data quality. The non-realtime component of the software is responsible for gridding the spectra into position-velocity cubes; typically 200000 spectra are gridded into an 8 x 8 degree cube.
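The per-channel processing applied every five seconds can be illustrated with a minimal bandpass-removal sketch. This is an illustration only, assuming a simple median-over-time bandpass estimate; the real Multibeam software uses AIPS++ classes and more elaborate statistics:

```python
import statistics

def remove_bandpass(spectra):
    """Estimate the bandpass as the per-channel median over a block of
    spectra, then divide it out of every spectrum.  Illustrative only."""
    nchan = len(spectra[0])
    bandpass = [statistics.median(s[c] for s in spectra) for c in range(nchan)]
    return [[s[c] / bandpass[c] for c in range(nchan)] for s in spectra]

# A flat-spectrum source seen through a sloped bandpass:
block = [[(c + 1) * 2.0 for c in range(4)] for _ in range(5)]
flat = remove_bandpass(block)   # every corrected value is ~1.0
```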

Object-Relational DBMSs for Large Astronomical Catalogues Management

A. Baruffolo, L. Benacchio (Astronomical Observatory of Padova)

Astronomical catalogues containing from a million up to hundreds of millions of records (e.g. Tycho, GSC-I, USNO-A 1.0) are becoming commonplace. While they are of fundamental importance in supporting operations of current and future large telescopes and space missions, they also serve as powerful research tools for galactic and extragalactic astronomy.

Since even larger catalogues will be released in a few years (e.g. the GSC-II), researchers are faced with the problem of accessing these databases in a general but efficient manner, in order to be able to fully exploit their scientific content.

Traditional database technologies (i.e. relational DBMSs) have proven inadequate for this task. Segmentation of catalogues into a catalogue-specific file structure accessed by a set of programs provides fast access but only limited query capabilities. Other approaches, based on new access technologies, must thus be explored.

In this paper we describe the results of our pilot project aimed at assessing the feasibility of employing Object-Relational DBMSs for the management of large astronomical catalogues. In particular we show that the database query language can be extended with astronomical functions to support typical astronomical queries. Further, access methods based on spatial data structures can be employed to speed up the execution of queries containing astronomical predicates.
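As a concrete illustration of such an astronomical predicate, the sketch below (hypothetical names, pure Python rather than extended SQL) implements a cone-search filter of the kind that, registered as a user-defined function, an object-relational DBMS could accelerate with spatial indexes:

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (spherical law of cosines)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cosd = (math.sin(d1) * math.sin(d2)
            + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

def cone_search(catalogue, ra, dec, radius):
    """All (ra, dec) rows within `radius` degrees of the search centre --
    the kind of predicate an extended query language could push into the DBMS."""
    return [row for row in catalogue
            if angular_separation(row[0], row[1], ra, dec) <= radius]

stars = [(10.0, 20.0), (10.1, 20.1), (180.0, -45.0)]
nearby = cone_search(stars, ra=10.0, dec=20.0, radius=1.0)
```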

Parallel tree N-body code: data distribution and DLB on CRAY T3D for large simulations

U. Becciani, V. Antonuccio-Delogu (Obs. Catania), G. Erbacci (CINECA), R. Ansaloni (Silicon Graphics Italy), M. Gambera (Obs. Catania), A. Pagliaro (Inst. Astr. Catania)

During the last 3 years we have developed an N-body code to study the origin and evolution of the Large Scale Structure of the Universe (Becciani et al. 1996, 1997). The code, based on the Barnes-Hut tree algorithm, has been developed under the CRAFT environment to share work and data among the PEs involved in the run. The main purpose of this work was the study of the optimal data distribution in the T3D memory, and of a strategy for Dynamic Load Balance (DLB), in order to obtain good performance when running simulations with more than 10 million particles. To maximize the number of particles updated per second at each step, we studied the optimal data distribution and the criterion for choosing the PE that executes the force-computation phase, so as to reduce load imbalance. The results of our tests show that the step duration depends on two main factors: data locality and T3D network contention. By increasing data locality we are able to minimize the step duration. In a very large simulation, network contention gives rise to an unbalanced load. The DLB consists of an automatic scheme: each PE executes the force-computation phase only for a fixed portion N of the bodies residing in its local memory, and the computation for all the remaining bodies is shared among all the PEs. The results obtained show that, for a fixed number of PEs and particles, the same value of N gives the best performance in both uniform and clustered conditions. This means that this quantity can be fixed and usefully adopted at run time, without introducing any significant overhead, to obtain a good Dynamic Load Balance.
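The fixed-portion scheme can be sketched as follows. All counts and the fraction are hypothetical, and the real code runs under CRAFT on the T3D rather than in Python:

```python
def balance_work(local_counts, frac):
    """Each PE computes forces for a fixed fraction `frac` of the bodies
    in its local memory; the remainder are pooled and dealt out evenly
    across all PEs.  Returns the per-PE work after balancing."""
    npe = len(local_counts)
    kept = [int(n * frac) for n in local_counts]
    pool = sum(local_counts) - sum(kept)
    share, extra = divmod(pool, npe)
    return [k + share + (1 if i < extra else 0) for i, k in enumerate(kept)]

# A badly unbalanced run: one PE holds most of the particles.
work = balance_work([1000, 100, 100, 100], frac=0.5)
```

With these numbers the most loaded PE drops from 10x the least loaded to roughly 3x, at the cost of sharing half the bodies across the network.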

Teaching Astronomy via the Internet

L. Benacchio, M. Brolis (Padova Astronomical Observatory), I. Saviane (Padova Department of Astronomy)

A project is being carried out at the Padova Astronomical Observatory, in partnership with Italian Telecom, whose aim is to supply high-quality multimedia educational material and tools to public schools (students aged 14-18) via the Internet. A WWW server has been set up, and in the early experimental phase a number of schools in the city area will be connected to the Observatory and hence to the Internet. Teachers and students will use it for the annual course (1997/98) in astronomy.

Our purpose is to fill a gap among currently active astronomical WWW sites, i.e., to provide a carefully designed server which will deliver reliable information in a structured way and, at the same time, take full advantage of the medium. Apparently there are no sites devoted to explaining basic astronomy and astrophysics at the middle-school level.

Our educational approach is based on the so-called Karplus cycle, that is: introduction of new concepts by means of proposed experiments, discussion and selection of the discovered laws, and 'correct' explanation of the observations and application to new situations. To prevent the student from trying to 'fit' the new knowledge into his/her already existing incorrect schemes, a preliminary phase for the removal of existing misconceptions is included. The knowledge is then introduced according to a hierarchical order of concepts.

The medium involved allows the full exploitation of this approach, since it permits direct experimentation by means of animation, java applets, and personalized answers to the proposed questions. Also, automatic tests and evaluations can be straightforwardly implemented. In this respect, it has a clear advantage over the traditional static book. Finally, the user can choose his/her own pace and path through the material offered.

We also propose a number of hands-on activities which extend and reinforce the concepts, and which require the presence of the teacher as an active guide.

Suggested presentation: Demo

Demonstration of Starlink Software

M. Bly, R. Warren-Smith (Rutherford Appleton Laboratory)

We will demonstrate the latest Starlink applications, which will be available on the late summer Starlink CD-ROM release. The highlights include FIGARO with handling of error data, the GAIA GUI, an enhanced CURSA (catalogue access), and FIGARO running from the IRAF CL. Applications using the NDF data library will be able to work with non-NDF (e.g. IRAF) data formats using on-the-fly data conversion.

The demo needs:

SUN Ultra model 140 workstation or higher, with 8-bit colour TGX graphics (or equivalent)
128Mb memory
4Gb disk (2Gb to be available for software and data)
20-inch colour display
4x or better CD-ROM drive
Solaris 2.5 operating system with CDE (Common Desktop Environment)
Sparc Compiler 4.2: Fortran 77, C and C++
Internet connection, if available

Nightly Scheduling of ESO's Very Large Telescope

A.M. Chavan (ESO), G. Giannone (Serco), D. Silva (ESO), T. Krueger, G. Miller (STScI)

A key challenge for ESO's Very Large Telescope (VLT) will be responding to changing observing conditions in order to maximize the scientific productivity of the observatory. For queued observations, the nightly scheduling will be performed by staff astronomers using an Operational Toolkit. This toolkit consists of a Medium Term Scheduler (MTS) and a Short Term Scheduler (STS), both integrated and accessible through a Graphical User Interface (GUI). The Medium Term Scheduler, developed by ESO, will be used to create candidate lists of observations based on different scheduling criteria. There may be different candidate lists based on "seeing", or priority, or any other criterion selected by the staff astronomer. An MTS candidate list is then selected and supplied to the Short Term Scheduler for detailed nightly scheduling. The STS uses the Spike scheduling engine, which was originally developed by STScI for use on the Hubble Space Telescope.
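Building an MTS-style candidate list from a seeing criterion might look like the following sketch. Field names and values are hypothetical, not those of the Operational Toolkit:

```python
def candidate_list(observations, tonight_seeing):
    """Keep observations whose seeing requirement tolerates tonight's
    conditions, ordered by priority (1 = highest).  Purely illustrative."""
    ok = [o for o in observations if o["max_seeing"] >= tonight_seeing]
    return sorted(ok, key=lambda o: o["priority"])

observations = [
    {"name": "x", "max_seeing": 1.0, "priority": 2},
    {"name": "y", "max_seeing": 0.6, "priority": 1},   # needs better seeing
    {"name": "z", "max_seeing": 1.4, "priority": 1},
]
tonight = candidate_list(observations, tonight_seeing=0.8)
```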

Invited talk:

CyberHype or Educational Technology - What is being learned from all those BITS?

C. Christian

I will discuss various information technology methods being applied to science education and public information. Of interest to the group at STScI and our collaborators is how science data can be mediated to the non-specialist client/user. In addition, I will draw attention to interactive and/or multimedia tools being used in astrophysics that may be useful, with modification, for educational purposes. In some cases, straightforward design decisions early on can improve the wide applicability of the interactive tool.

How to exploit an astronomical gold mine: Automatic classification of Hamburg/ESO Survey spectra

N. Christlieb (Hamburg Obs.), G. Graeshoff (MPI History of Science, Berlin / Univ. Hamburg), A. Nelke, A. Schlemminger (Univ. Hamburg), L. Wisotzki (Hamburg Obs.)

We present methods for automatic one-dimensional classification of digitized objective-prism spectra, developed in the course of the Hamburg/ESO Survey (HES) for bright QSOs. The HES covers about 10,000 deg^2 of the southern extragalactic sky, yielding several million usable spectra in the range 12 ≲ B ≲ 17. The resolution of the HES spectra is ~15 Å at Hγ, sufficient to detect the strongest stellar absorption features.

Our astronomical aims are:

Construction of complete samples of quasar candidates by identification of objects that do not have stellar absorption patterns, via classification with the Bayes rule plus a reject option.

Construction of complete samples of rare stellar objects, e.g. white dwarfs, horizontal-branch A-stars, or extremely metal-poor halo stars. Here a minimum-cost rule is used.

"Simple" classification of all HES spectra with the Bayes rule, e.g. to provide a data basis for cross-identification with surveys in other wavelength ranges.

The feature space used for classification consists of equivalent widths of stellar absorption features.
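A minimal sketch of the Bayes rule with a reject option follows; class names, posterior values, and the threshold are illustrative:

```python
def classify_with_reject(posteriors, threshold=0.9):
    """Bayes rule with a reject option: choose the maximum-posterior
    class, or 'reject' when the maximum falls below `threshold`.  In the
    survey, posteriors would come from densities fitted to the
    equivalent-width feature vectors; here they are supplied directly."""
    best = max(posteriors, key=posteriors.get)
    return best if posteriors[best] >= threshold else "reject"

confident = classify_with_reject({"star": 0.97, "qso": 0.03})   # "star"
ambiguous = classify_with_reject({"star": 0.55, "qso": 0.45})   # "reject"
```

Rejected spectra are the interesting ones for completeness: an object the rule cannot confidently call a star is retained as a QSO candidate.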

We report on the discovery of the extremely metal-poor halo star HE 2319-0852, [Fe/H] = -3.5 ± 0.5, which was discovered in a test survey for these objects on a few of our plates, using simulated spectra as the learning sample.

Building Software Systems from Heterogeneous Components

M. Conroy, E. Mandel, J. Roll (SAO)

Over the past few years there has been a movement within astronomical software towards "Open Systems". This activity has resulted in the ability of individual projects or users to build customized processing systems from a variety of existing components. We will present examples of user-customizable systems that can be built from existing components, where the only requirements on each component are:

a) Use of a common parameter interface library.

b) Use of FITS as the input/output file format.

c) Unix + X-windows environment

With these three minimal assumptions it is possible to build a customized image-display driven data analysis system as well as automated data reduction pipelines.

Suggested presentation: Demo

Demonstration of AIPS++

T. Cornwell, B. Glendenning (NRAO), J. Noordam (NFRA)

AIPS++ is a package for radio-astronomical data reduction now under development by a consortium of radio observatories. It is currently in beta release and is expected to be publicly released in late 1997.

Description of demo:

We will demonstrate the beta version of AIPS++. This will consist of a demonstration by an AIPS++ Project Member at regularly scheduled times. In addition, we will make the system available for use by others.

VRML and Collaborative Environments: New Tools for Networked Visualization

R.M. Crutcher, R.L. Plante, and P. Rajlich (National Computational Science Alliance/Univ. of IL)

We present two new applications that engage the network as a tool for astronomical research and/or education. The first is a VRML (virtual reality modeling language) server which allows users over the Web to interactively create three-dimensional (3D) visualizations of FITS images contained in the NCSA Astronomy Digital Image Library (ADIL). The server's Web interface allows users to select images from the ADIL, fill in processing parameters, and create renderings featuring isosurfaces, slices, contours, and annotations; the often extensive computations are carried out on an NCSA SGI supercomputer server without the user having an individual account on the system. The user can then download the 3D visualizations as VRML files, which may be rotated and manipulated locally on virtually any class of computer. The second application is the ADILBrowser, a part of the NCSA Horizon Image Data Browser Java package. ADILBrowser allows a group of participants to browse images from the ADIL within a collaborative session. The collaborative environment is provided by the NCSA Habanero package which includes text and audio chat tools and a white board. The ADILBrowser is just an example of a collaborative tool that can be built with the Horizon and Habanero packages. The classes provided by these packages can be assembled to create custom collaborative applications that visualize data either from local disk or from anywhere on the network.

Fitting and Modeling of the AXAF Data with the ASC Fitting Application

S. Doe, A. Siemiginowska, M. Ljungberg, W. Joye (SAO)

The AXAF mission will provide X-ray data with unprecedented spatial and spectral resolution. Because of the high quality of these data, the AXAF Science Center will provide a new data analysis system, part of which is a new fitting application. Our intent is to enable users to do fitting that is too awkward for, or beyond the scope of, existing astronomical fitting software. Our main goals are: 1) to take advantage of the full capabilities of AXAF by providing a more sophisticated modeling capability (i.e., models that are f(x,y,E,t), models to simulate the response of AXAF instruments, and models that enable "joint-mode" fitting, i.e., combined spatial-spectral or spectral-temporal fitting); and 2) to provide users with a wide variety of models, optimization methods, and fit statistics. In this paper, we discuss the use of an object-oriented approach in our implementation, the current features of the fitting application, and the features scheduled to be added in the coming year of development. Current features include: an interactive, command-line interface; a modeling language which allows users to build models from arithmetic combinations of base functions; a suite of optimization methods and fit statistics; the ability to perform fits to multiple data sets simultaneously; and an interface with SM and SAOtng to plot or image data, models, and/or residuals from a fit. We currently provide a modeling capability in one or two dimensions, and have recently made an effort to perform spectral fitting in a manner similar to XSPEC. We also allow users to dynamically link the fitting application to algorithms written by users. Our goals for the coming year include: incorporating the XSPEC model library as a subset of models available in the application; enabling "joint-mode" analysis; and adding support for new algorithms.
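The idea of building models from arithmetic combinations of base functions can be sketched with operator overloading. The class and function names below are illustrative, not the ASC application's actual API:

```python
class Model:
    """Base functions combinable with '+' and '*', in the spirit of a
    modeling language built on arithmetic composition."""
    def __init__(self, fn):
        self.fn = fn
    def __call__(self, x):
        return self.fn(x)
    def __add__(self, other):
        return Model(lambda x: self(x) + other(x))
    def __mul__(self, other):
        return Model(lambda x: self(x) * other(x))

def const(c):
    return Model(lambda x: c)

def powerlaw(a):
    return Model(lambda x: x ** a)

# Build 2*x**2 + 3 by arithmetic composition of base functions:
model = const(2.0) * powerlaw(2.0) + const(3.0)
```

A fit engine then only needs to evaluate the composite `model` at the data grid, regardless of how the user assembled it.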

New Capabilities of the ADS Abstract and Article Service

G. Eichhorn, A. Accomazzi, C.S. Grant, M.J. Kurtz, S.S. Murray (SAO)

The ADS abstract service has been updated considerably in the last year. New capabilities in the search engine include searching for multi-word phrases and for various logical combinations of search terms. Through optimization of the custom-built search software, search times were decreased by a factor of 4 in the last year.

The WWW interface now uses WWW cookies to store and retrieve individual user preferences. This allows our users to set preferences for printing, accessing mirror sites, fonts, colors, etc. Information about most recently accessed references allows customized retrieval of the most recent unread volume of selected journals. The information stored in these preferences is kept completely confidential and is not used for any other purposes.

Two mirror sites (at the CDS in Strasbourg, France and at NAO in Tokyo, Japan) provide faster access for our European and Asian users.

To include new information in the ADS as quickly as possible, new indexing and search software was developed to allow updating the index data files within minutes of receipt of time-critical information (e.g., IAU Circulars, which report supernova and comet discoveries).

The ADS is currently used by over 10,000 users per month, who retrieve over 4.5 million references and over 250,000 full article pages each month.

Invited talk:

Object-Oriented Experiences with GBT Monitor and Control

J.R. Fisher (NRAO)

The Green Bank Telescope Monitor and Control software group adopted object-oriented design techniques as implemented in C++. The experience has been generally positive, but there certainly have been many lessons learned in the process. The long analysis phase of the OO approach has led to a fairly coherent software system and a lot of module (class) reuse. Many devices (front-ends, spectrometers, LOs, etc.) share the same software structure, and implementing new devices in the latter part of the project has been relatively easy, as is to be hoped with an OO design. One disadvantage of a long design phase is that it is hard to evaluate progress and to have much sense of how well the design satisfies the real user needs. In retrospect, the project might have been divided into smaller units with tangible products at early and mid stages. The OO process is only as good as the requirement specifications, and the process has had to deal with continually emerging requirements throughout the analysis, design, and implementation phases. Changes and fixes to core software modules have not been too painful, but they do require a robust software version control system. Large and medium scale tests of the system in the midst of the implementation phase have required quite a bit of time and coordination effort. This has tended to inhibit progress evaluations.

News on the ISOPHOT Interactive Analysis (PIA)

C. Gabriel (ESA-SSD)

The ISOPHOT Interactive Analysis system (PIA), a calibration and scientific analysis tool for the ISOPHOT instrument on board ESA's Infrared Space Observatory (ISO), has been further developed while ISO is in operation.

After 18 months of ISO operations, considerable experience has been gained with PIA, leading to several new features in the package. This experience comes not only from the ISOPHOT Instrument Dedicated Team in its tasks of, e.g., calibration, instrument performance checking, and refinement of analysis techniques, but also from a large number of ISOPHOT observers at around 100 astronomical institutes all over the world. PIA has been distributed freely for more than a year to all astronomers wishing to use it for ISOPHOT data reduction and analysis. The feedback from the different users is reflected not only in the extension of the analysis capabilities, but also in a friendlier graphical interface, better documentation, and easier installation. PIA has thus become not only a very powerful calibration tool but also the software tool of choice for the scientific analysis of ISOPHOT data.

In this paper we concentrate on some of these PIA enhancements: in the scientific analysis, in the documentation, and in the related general services to the astronomical community.

Distributed Searching of Astronomical Databases with Pizazz

K. Gamiel (National Computational Science Alliance/Univ. of IL)

The NCSA Pizazz SDK is an information retrieval communications toolkit that includes code and applications for easily integrating existing database systems into a globally accessible, open-standards-based system. The toolkit includes a TCP-based server and information retrieval protocol engine that handles all network communication between client and server. The server is designed as a drop-in application, extending the functionality of legacy database systems and creating a global infrastructure of astronomical database resources. The toolkit uses the Z39.50 information retrieval protocol.

Achieving Stable Observing Schedules in an Unstable World

M. Giuliano (STScI)

Operations of the Hubble Space Telescope (HST) require the creation of stable and efficient observation schedules in an environment where inputs to the plan can change daily. Operations must allow observers to adjust observation parameters after submitting the proposal. PIs must also be informed well in advance of the approximate date of an observation so they can plan for coordinated observations and data analysis. Scheduling is complicated by ongoing changes in the HST operational parameters and because the precise ephemeris for HST is not known in advance. Given these constraints, it is not possible to create a single static schedule of observations. Instead, scheduling should be considered an ongoing process which creates and refines schedules as required. Unlike other applications of replanning, the HST problem places a premium on ensuring that a replan minimally disturbs the existing plan. A process and architecture are presented which achieve these goals by dividing scheduling into long term and short term components. The long term scheduler, the main focus of this paper, provides approximate 4-8 week plan windows for observations. A plan window is a subset of an observation's constraint windows, and represents a best-effort commitment to schedule in the window. The long range planner ensures plan stability, balances resources, and provides the short term scheduler with the proper mixture of visits to create week-long schedules. The short term scheduler builds efficient week-long observation schedules by selecting observations whose plan windows open within the week.

The long term scheduler as implemented within the Spike software system provides support for achieving stable observation schedules. Spike models the planning process as a function which takes as input a previous plan, a set of proposals, and some search criteria, and produces as output a new plan. Stability is ensured by using the input plan to guide the creation of the new plan. Through this mechanism Spike can handle instabilities such as changed observation specifications, out-of-date observation products, and errors in loading observation specifications. Special routines are provided for planning and ensuring stability for observations linked by timing requirements (e.g. observation 2 after observation 1 by 6-8 days). Spike provides a combined heuristic and stochastic search engine, with user-defined weights, for finding near-optimal plans.
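The hand-off from plan windows to weekly scheduling can be sketched as follows; observation names and day numbers are hypothetical, and the real Spike machinery adds resource balancing and search on top of this filter:

```python
def select_for_week(plan_windows, week_start, week_end):
    """Short-term selection: pick observations whose long-term plan
    window is open at some point during the given week.  Windows are
    (name, open_day, close_day) tuples in day numbers."""
    return [name for name, open_day, close_day in plan_windows
            if open_day <= week_end and close_day >= week_start]

plans = [("A", 10, 52), ("B", 12, 40), ("C", 30, 60)]
picked = select_for_week(plans, week_start=8, week_end=14)
```

Because plan windows are wide (4-8 weeks), a replan can move an observation within its window without invalidating the commitment made to the PI.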

Suggested presentation: Demo

New applications of Artificial Neural Networks in stellar spectroscopy

R. Gupta, R.K. Gulati (IUCAA), H.P. Singh (University of Delhi)

Recently, Artificial Neural Networks (ANNs) have been proved to be a very efficient technique for stellar spectral classification in spectral regions of UV, Optical and IR. Various groups including ours have used this technique with the main aim to evolve an automated procedure for use with large upcoming stellar spectral libraries which will be the major outcome of several ongoing surveys being undertaken at many astronomical observatories. In an attempt to explore newer areas of applications, we have extended this technique to obtain stellar atmospheric parameter Teff; determination of a third dimension of classification from UV data i.e. color excess E(B-V) and applying Principal Component Analysis (PCA) as a pre-processor before using the ANN on spectral data. In the application of stellar atmospheric effective temperature, we present the first ever attempt to obtain Teff for dwarf stars by ANN technique and obtain results comparable to earlier attempts by other statistical techniques. In the second application, we show that ANNs can extract a third dimension of spectral classification viz. color excess E(B-V) apart from already established spectro- luminosity classification. Finally, we have used PCA prior to applying ANN on our first results on Optical spectra and improved the efficiencies for classification.
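A toy sketch of the PCA pre-processing step followed by classification is given below; a nearest-centroid rule deliberately stands in for the ANN, all data are synthetic, and the projection is to a single principal component for brevity:

```python
import math

def first_pc(data, iters=100):
    """First principal component via power iteration (pure Python)."""
    dim = len(data[0])
    mean = [sum(r[i] for r in data) / len(data) for i in range(dim)]
    centred = [[r[i] - mean[i] for i in range(dim)] for r in data]
    v = [1.0] * dim
    for _ in range(iters):
        # Apply the (unnormalised) covariance matrix to v, then renormalise.
        proj = [sum(a * b for a, b in zip(r, v)) for r in centred]
        w = [sum(r[i] * p for r, p in zip(centred, proj)) for i in range(dim)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return mean, v

def project(row, mean, v):
    return sum((x - m) * c for x, m, c in zip(row, mean, v))

# Toy "spectra": two classes separated along the first feature.
data = [[0.0, 0.0], [1.0, 0.1], [0.5, -0.1],
        [10.0, 0.0], [11.0, 0.1], [10.5, -0.1]]
labels = ["A", "A", "A", "B", "B", "B"]

mean, v = first_pc(data)
centroids = {}
for lab in set(labels):
    ps = [project(r, mean, v) for r, l in zip(data, labels) if l == lab]
    centroids[lab] = sum(ps) / len(ps)

def classify(row):
    """Nearest centroid in PCA space -- a simple stand-in for the ANN."""
    p = project(row, mean, v)
    return min(centroids, key=lambda lab: abs(centroids[lab] - p))
```

The point of the PCA stage is dimension reduction: the classifier, whether nearest-centroid or an ANN, then works on a handful of components instead of the full spectrum.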

Description of computer demo

It is proposed that in the demo session the three new applications be shown on a suitable computer platform: a UNIX-based SUN or DEC workstation with Mathematica and SM (Super Mongo) for plotting the graphs, and with telnet/ftp access for downloading the programs, is requested at the conference venue for this purpose. Though it may not be possible to run the training sessions during the limited period of the ADASS conference, the demo will amply demonstrate the performance of ANNs in the three new areas mentioned in the abstract above, by running the test sessions and showing the classification accuracies graphically.

ASCA: An International Mission

P. Hilton (Hughes/ISAS) and A. Antunes (HSTX/GSFC)

The ASCA X-ray satellite mission involves scientists from Japan, America, and Europe. Each year more than 400 targets are observed by ASCA. The process starts with the electronic submission of a proposal and ends with the delivery of a data tape. A successful observation depends on organization within the operations team and efficient communication between the operations team and the observers. The methods used for proposals, scheduling, coordinating observations, quick-look plots, and data delivery are presented.

A Trans-Continental Local Area Network

G. Hunt (NRAO)

The National Radio Astronomy Observatory (NRAO) has facilities at 17 different locations scattered throughout the USA. These vary in size from the major laboratories occupied by research and support staff to the ten individual antennas of the Very Long Baseline Array. As is typical in astronomy, many sites are in remote locations which are not well served with modern communication capabilities. Until 1996, the NRAO's internal network ran over the Internet; most sites simply had a local port to the Internet and the traffic was routed tortuously to the other locations. Demand for Internet bandwidth was (and still is) growing faster than the services could be enhanced, and this led to intolerably slow response times and unacceptably low achieved data rates. To solve this problem, the NRAO acquired a frame relay intranet to connect ten of its locations. The service is provided under the federal FTS2000 contract by AT&T. The operating cost is approximately the same as that of the multiple Internet connections, but with vastly improved throughput and reliability.

Suggested presentation: Demo

SAOimage: The Next Generation (SAOtng)

W. Joye (SAO)

SAOtng is a new version of the popular SAOimage display program. It is a superset of the ximtool program developed at NOAO for IRAF and, as such, utilizes the NOAO widget server (included in this package). It also incorporates the X Public Access mechanism to allow external processes to access and control its data, GUI functions, and algorithms. SAOtng supports direct display of IRAF images and FITS images (and can easily support other file formats), multiple frame buffers, region/cursor manipulation, several scale algorithms, many colormaps, and easy communication with external analysis tasks. It is highly configurable and extensible to meet the evolving needs of the astronomical community.

Other People's Software

E. Mandel (SAO)

The past decade has witnessed an explosion of astronomical software development. Many talented individuals have made great efforts to develop libraries, programs, and analysis systems that serve the needs of their projects and the community at large. As a result, computer-aided astronomical research is vastly more sophisticated than it was even a few years ago.

One of the most striking features of this software explosion, considered as a whole, is the tremendous overlap in its functionality and its target audience. Why do we continually re-invent the astronomical software wheel? Why is it so difficult to use "other people's software"?

An approach to these questions can be made by contemplating the implications of the statement that "many talented individuals have made great efforts" to develop astronomical software. It is not enough simply to talk about cooperation in a theoretical way, and it is not possible to force cooperation. Rather, we need to investigate practically how individuals (and software) can begin to act in concert without sacrificing their independence or compromising the needs of their projects. This paper will examine these issues through specific examples and will offer a practical starting point for software cooperation, centered on the concept of "minimal software buy-in".

Suggested presentation: Demo

The HEASARC on the Web

T. McGlynn, W. Pence, N. White, S. Calvo, M. Duesterhaus, P. Newman, S. Zobair, C. Rosen, E. Sabol, L. Brown, B. O'Neel, S. Drake, L. Whitlock (NASA/GSFC)

The High Energy Astrophysics Science Archive Research Center is the premier archive for NASA's high-energy astronomy data. This demonstration will show a few of the capabilities that are provided to astronomers and the public at the HEASARC. These include:

- A complete, portable environment for analyzing high-energy data using FTOOLS and XANADU.
- Astronomical information from more than a dozen missions on the web using W3Browse.
- Immediate and easy searching of many HEASARC and remote Web sites for astronomical objects using the Astrobrowse system discussed at this meeting.
- The multi-wavelength virtual telescope: SkyView.
- Sophisticated educational resources for children and adults of all ages.

Description of the demo:

We intend to access the resources of the HEASARC over the Web and show a few programs running locally (or at least displaying windows locally). The demonstration will require a computer with graphical capabilities and Netscape V3.0 or higher or Internet Explorer V3.0 or higher. It would be substantially preferable to have a Unix system rather than a PC or a Mac. Some tools we would like to demonstrate require X Windows capability.

Using Java for Astronomy: The Virtual Radio Interferometer Example

N.P.F. McKay, D.J. McKay (NRAL, Jodrell Bank)

This paper discusses the ramifications of the relatively new Java computing environment for the field of astronomy. It indicates the advantages and disadvantages of the language, concentrating on the specific needs of various aspects of astronomical computing. To illustrate some of the concepts, the authors present the Virtual Radio Interferometer (VRI), which allows the demonstration of aperture synthesis by simulating the Australia Telescope Compact Array and other observatories in software. This Java applet may be used as an educational tool for teaching interferometry as well as a utility for observers.

Constructing and Reducing Sets of HST Observations Using Accurate Spacecraft Pointing Information

A. Micol (ST-ECF), P.D. Bristow (Bristol), B. Pirenne (ST-ECF)

The implementation of On-The-Fly Re-Calibration at the ST-ECF and CADC goes some way towards alleviating the problem of obtaining good and timely calibration of HST exposures. However, the data access paradigm is still to consider each exposure individually, re-calibrate it and offer the results to users, who subsequently process the data further.

We describe here techniques to automatically group together HST WFPC2, STIS and NICMOS exposures for cosmic ray removal, co-addition and combination into "super high resolution" images. We show that the execution of these tasks has been made essentially automatic.

The ST-ECF archive now offers the possibility to select "associations" of datasets and the automatically combined final products. A further spin-off of this project is that more reliable pointing information is provided for all exposures.

Invited talk:

Noise Detection and Filtering using Multiresolution Transform Methods

F. Murtagh (Univ. Ulster & Obs. Astron. Strasbg.) and J.L. Starck (CEA)

We discuss noise in the context of astronomical image processing, including common noise models and variance stabilization. We then look at why multiresolution analysis has been so successful in allowing noise to be filtered. Multiresolution transforms such as the wavelet transform and the pyramidal median transform are briefly described. An innovative data structure in the context of such transforms, termed the multiresolution support, is defined. Two important application fields are investigated: firstly, image compression; and, secondly, high-quality automated noise determination. In the context of the latter, we show how the same approach can be used for filtering out anomalously high-valued artifacts (cosmic ray glitches).
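The glitch-filtering idea can be illustrated with a toy: a single-pixel cosmic-ray spike barely affects a running median, so thresholding the residual isolates it. This is only a one-dimensional sketch of the principle, not the pyramidal median transform itself; the window width and threshold are assumptions.

```python
import numpy as np

def running_median(x, w=3):
    """Running median with edge padding; a crude smoothing stand-in."""
    pad = np.pad(x, w // 2, mode="edge")
    return np.array([np.median(pad[i:i + w]) for i in range(len(x))])

signal = np.sin(np.linspace(0, 2 * np.pi, 100))
noisy = signal.copy()
noisy[40] += 10.0                     # inject a cosmic-ray-like glitch

smooth = running_median(noisy, w=5)
residual = noisy - smooth
glitches = np.abs(residual) > 3.0     # assumed significance threshold

# replace flagged samples by the smooth estimate
cleaned = np.where(glitches, smooth, noisy)
print(int(glitches.sum()))
```

Only the injected spike exceeds the threshold; genuine low-frequency structure survives the median almost unchanged, which is the property the multiresolution support exploits across scales.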

Invited talk

Early Universe Simulation

Software for Astrophysical and Cosmological Hydrodynamics Simulations

Michael L. Norman (Laboratory for Computational Astrophysics, Department of Astronomy and National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign)

Hydrodynamic simulations of diverse astrophysical systems -- from stars to the large-scale structure of the IGM -- are in a period of explosive growth as a means to rationalize observational data as well as to understand the underlying physics. Pacing this growth is the availability of sophisticated simulation software as well as computer power. I describe our efforts at the Laboratory for Computational Astrophysics to create a suite of community simulation codes for astrophysical and cosmological simulation, and to disseminate them to the international research community. In particular, I review the status of the ZEUS codes, which are in widespread use for astrophysical fluid dynamics simulations. I also describe KRONOS, a new code for hydrodynamic cosmological simulations, and 4D2, a 3D data visualization tool. These codes are illustrated with applications to supernova remnant modeling and cosmological structure formation at high redshift. More information about the LCA can be found at the following website:

The VizieR system to access astronomical data

F. Ochsenbein (CDS, Observatoire astronomique de Strasbourg)

The VizieR system, initially developed in a collaboration between the ESIS project (European Space Information System of the European Space Agency) and CDS (Centre de Données astronomiques de Strasbourg), is a tool which allows access to individual records within a very large collection of astronomical catalogues: the query can be a set of constraints applied to any column of any of the ~4000 tables from about 1500 astronomical catalogues. For tables related to astronomical objects, the query can also involve proximity to a specified target object.

The VizieR system essentially consists of the conversion of the data files and their associated descriptions into tables of a relational DBMS. The ingestion of such a large number of data files is mainly the result of a large effort of rationalisation and standardisation in the documentation of astronomical catalogues, conducted at CDS and the associated data centers for several years: the inclusion of new catalogues and data-sets is now achieved by pipeline processing of the standardized catalogue documentation.

An essential part of the VizieR system consists of a set of dedicated tables containing the description of all tabular parameters: the META database, or Reference Directory. These tables can themselves be referenced in queries, allowing one, for example, to find the catalogues providing data related to polarisation. This set of META tables, which will be briefly presented, was recently improved, leading to enhanced performance.

The very large catalogues (e.g. the USNO-A1.0 catalogue of 488 million stars) can also be queried from VizieR through dedicated interfaces; the access to such large catalogues is however restricted to small fractions of the sky for obvious performance reasons. A generalized access from celestial positions to a large number of catalogues will also be presented.

The service can be accessed from URL:
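The target-proximity constraint mentioned above amounts to an angular-separation cut on catalogue positions. The sketch below illustrates the idea with a toy in-memory table; the positions, target, and radius are invented for illustration, and VizieR of course evaluates such constraints inside its DBMS rather than in client code.

```python
import numpy as np

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    a = (np.sin((dec2 - dec1) / 2) ** 2
         + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2)
    return np.degrees(2 * np.arcsin(np.sqrt(a)))

# toy catalogue rows: (RA, Dec) in degrees
catalogue = np.array([[10.68, 41.27], [10.70, 41.26], [150.0, 2.2]])
target = (10.684, 41.269)   # hypothetical user-specified target
radius = 0.1                # search radius in degrees

sep = angular_sep_deg(catalogue[:, 0], catalogue[:, 1], *target)
matches = catalogue[sep < radius]
print(len(matches))
```

The first two rows fall within the radius; the third, far from the target, is rejected.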

Accessing astronomical data over the WWW using datOZ

P.F. Ortiz (Univ. Chile)

The Department of Astronomy of the University of Chile hosts a number of astronomical databases created with datOZ (Ortiz, 1997). This is not an ftp site, but one in which databases are accessed interactively and in real time through an HTML interface, using HTTP as the communications protocol. Data can be retrieved by users in a highly flexible way, from lists of user-selected quantities to customized plots, including any additional multimedia information accessible for the database elements. The latest additions to the system point in two directions: a) allowing access not only to the values of the defined variables, but also to quantities constructed from the database variables, which in turn can be used for plotting, definition of constraints, neighbour searches, or data retrieval; and b) allowing the retrieval and further correlation of information stored in different datOZ databases, which need not reside on the same machine.

The benefits of accessing catalogs over the WWW are described for both research and educational purposes. Getting an ASCII table with many columns of data onto your computer is not the same as accessing the same data through a flexible interface that allows immediate visualization. One of the main features of this database system is its capacity to plot quantities on request for defined subsets of the data stored in the database. Visualization and information organization are key aspects of exploiting the content of a database.

At our Department, we have set up databases to be consulted (like catalogs) and databases for specific projects, which are protected from the rest of the world, keep the data in a handy way, and support all stages of the research projects. In this paper I describe the capabilities of public databases and the benefits a researcher or student can derive from such a tool.

Data Analysis with ISOCAM Interactive Analysis - preparing for the future

S. Ott (ESA) and R. Gastaud (CEA)*

The ISOCAM Interactive Analysis System (CIA) was developed to calibrate ISOCAM, the infrared camera on board the Infrared Space Observatory (ISO), and to perform its astronomical data processing.

Currently data processing consists of multiple steps:

o data preparation
o cross-talk correction
o dark current subtraction
o flat-fielding
o deglitching
o transient correction
o mosaic generation
o generation of spectra
o interfacing with non-ISO data products

We will review the algorithms currently implemented in CIA, present some examples, and outline foreseen changes to accommodate future improvements to these algorithms and the user interface.

*on behalf of the CIA development team at CEA/Saclay, ESA/ISO/Villafranca, IAS/Orsay and IPAC/Pasadena
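Two of the steps listed above (dark-current subtraction and flat-fielding), plus a crude deglitching pass, can be sketched on a synthetic detector cube. This is not CIA code: the array shapes, calibration maps, and clipping threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
cube = rng.normal(loc=100.0, scale=1.0, size=(16, 32, 32))  # frames, y, x
dark = np.full((32, 32), 10.0)      # assumed dark-current map (ADU)
flat = np.full((32, 32), 2.0)       # assumed flat-field response

# dark subtraction, then flat-fielding
calibrated = (cube - dark) / flat

# crude deglitching: replace pixels deviating wildly from the temporal median
med = np.median(calibrated, axis=0)
glitchy = np.abs(calibrated - med) > 5.0
calibrated[glitchy] = np.broadcast_to(med, calibrated.shape)[glitchy]

print(calibrated.shape)
```

The real transient and cross-talk corrections are detector-physics models and have no such one-line analogue.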

The XMM Science Operations Centre Control System

N. Peccia, F. Jansen, H. Nye, M. Merri, J. Riedinger (ESA)

The XMM satellite is a powerful new X-ray observatory, providing astronomers with the tools necessary to make advances in high-energy astrophysics into the next century. Launch is scheduled for 1 August 1999 on an Ariane 5 launch vehicle, with insertion into a highly eccentric, high-inclination orbit having a perigee height of 240 km, an apogee height of 114,000 km and an orbital period of 47.8 hours. The mission will be of long duration, with a design lifetime of 27 months and consumables to extend the mission for up to ten years. The payloads will generate a high throughput of data, which will eventually be accessible to the worldwide astronomical community.

The XMM Science Control System (XSCS) is based at the Vilspa ground station in Villafranca, Spain. It allows the astronomical community to submit proposals for observations via the Proposal Handling Subsystem (PHS). These proposals are evaluated manually for scientific worth, and those accepted are passed to the Sequence Generation Subsystem (SGS). This produces a schedule, the preferred observation sequence (POS), on a revolution-by-revolution basis, based on the submitted proposals and on orbital constraint data supplied by Flight Dynamics at ESOC, Darmstadt, Germany. Once a POS is produced, it is sent to ESOC for further processing and eventually uplinked to the spacecraft from the XMM Mission Control System (XMCS) at ESOC.

All telemetry received from the spacecraft by the XMCS is routed back to the XSCS via TCP/IP over dedicated leased lines. This data is received by the Payload Monitoring Subsystem (PMS), where standard SCOS-1 functions provide monitoring of instrument housekeeping data. Additional, mission-specific functions provide monitoring of the science data. In particular, the Quick Look Analysis (QLA) subsystem enables the monitoring of science data images as they are received. This also enables "change requests" to be generated, which are sent to the XMCS for uplink to the spacecraft; these are in effect commands enabling the settings of an instrument to be tuned during an observation.

All science telemetry received by the PMS is further processed by the Observation Data Subsystem (ODS), which turns the data into FITS format files. These FITS files are then sent to the Survey Science Centre (SSC) in Leicester, U.K., for detailed scientific analysis (Pipeline Processing). The products of this analysis are returned to the XSCS, where they are stored in the Archive Management System (AMS) and made available to the scientific community. External users can browse the archive via a web-based interface and request data from it. Due to the size of the data sets, the main medium of data distribution is CD-ROM.

Additionally, the SOC is responsible for maintaining the instruments' on-board software during the mission. The Instrument Software System (ISS) replicates all the instrument Software Development Environments (SDEs) and validates on-board software changes (instrument controllers only) using a SOC simulator.

The XSCS is implemented on a variety of platforms: Alpha/OpenVMS for the PMS and ODS, and Sun/Solaris for the PHS, SGS, AMS and QLA. All web interfaces are based on a Windows NT platform. In order to handle the data produced during the lifetime of the XMM mission (currently estimated at 2-4 Terabytes), a Hierarchical Storage Mechanism (HSM) is used in the AMS to provide sufficient data storage capacity.

A Queriable Repository for HST Telemetry Data, A Case Study in using Data Warehousing for Science and Engineering

J.A. Pollizzi, III, K. Lezon (STScI)

The Hubble Space Telescope (HST) generates on the order of 7,000 telemetry values, many of which are sampled at 1 Hz, with several hundred parameters sampled at 40 Hz. Such data volumes would quickly tax even the largest of processing facilities. Yet the ability to access the telemetry data in a variety of ways, and in particular using ad hoc (i.e. not fixed a priori) queries, is essential to assuring the long-term viability and usefulness of this instrument. As part of the recent NASA initiative to re-engineer HST's ground control systems, a concept arose to apply newly available data warehousing technologies to this problem. The Space Telescope Science Institute was engaged to develop a pilot to investigate the technology and to create a proof-of-concept testbed that could be demonstrated and evaluated for operational use. This paper describes this effort and its results.

First, background is given contrasting data warehousing technology with the more familiar relational database technology. We then describe how HST telemetry challenges any attempt at a queriable system. The paper follows with the various choices and compromises we made in using a particular warehouse product to meet this goal, and summarizes with lessons learned and some actual benchmark results from the effort.
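The "ad hoc query" requirement is the crux: analysts compose queries over telemetry samples at analysis time rather than choosing from a fixed menu. A minimal sketch, with sqlite3 standing in for the warehouse product and invented mnemonics and values:

```python
import sqlite3

# In-memory table of (time, mnemonic, value) telemetry samples.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE telemetry (t REAL, mnemonic TEXT, value REAL)")
rows = [(0.0, "BATT_V", 28.1), (1.0, "BATT_V", 27.9),
        (0.0, "GYRO_RATE", 0.02), (1.0, "GYRO_RATE", 0.03)]
db.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", rows)

# An ad hoc query composed at analysis time, not fixed a priori:
# the mean of one parameter over the stored interval.
cur = db.execute(
    "SELECT mnemonic, AVG(value) FROM telemetry "
    "WHERE mnemonic = ? GROUP BY mnemonic", ("BATT_V",))
result = cur.fetchall()
print(result)
```

A warehouse differs from this toy mainly in scale: star-schema layout, bitmap indexing, and aggressive pre-aggregation make such queries tractable over billions of samples.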

Invited talk:

The VLT Data Flow System : A Progress Report

Peter J. Quinn (ESO)

In order to realize the maximum scientific return from the VLT, ESO has undertaken to develop an end-to-end data flow system from proposal entry to science archive. The VLT Data Flow System (DFS) is being designed and implemented by the ESO Data Management Division in collaboration with the VLT and Instrumentation Divisions. Tests of the DFS started in October 1996 on ESO's New Technology Telescope. Since then, prototypes of the Phase 2 Proposal Entry System, VLT Control System Interface, Data Pipelines, Online Data Archive, Data Quality Control and Science Archive System have been tested. Several major DFS components have been run under operational conditions since July 1997. This presentation will give a summary of the current state of the VLT DFS, the experience gained from prototyping on the NTT, and the planning for VLT operations beginning in early 1999.

Cost-effective System Management

S. Schaller (Steward Observatory)

Quality system management of computer workstations can be achieved with relatively small manpower requirements, if the right cost-effective administrative design decisions are made.

Recent dramatic changes in the price/performance ratio of computer hardware have modified the model used to distribute computer resources and especially the usage of the network.

Invited talk:

Managing Information Technology Infrastructure

J. Schwarz (ESO)

Now that the computer has passed from a useful to an indispensable tool in every astrophysics research or teaching institution, its administration has become serious business. Today's user expects continuous, uninterrupted service from the computer and its environment of networks and peripherals. System crashes and scheduled downtime during working hours, which used to be regarded as facts of life, are now considered unacceptable by most users. In the face of expectations such as these, infrastructure planning, dependable maintenance service, and resource optimization are required of anyone who is responsible for an Information Technology (IT) center.

The IT center can no longer afford to let equipment wear out before replacing it; a planned roll-over of all IT hardware is necessary. Likewise, only an implausibly manpower-rich computing facility can afford to maintain a widely varied mix of computer makes and models; an inexpensive clone of a major brand may prove to be a major maintenance headache later. In the very bastion of academic freedom, hard choices may have to be made about just how much freedom the user is to be allowed in choosing a desktop computer system. "You can buy it but we won't support it" is easy to say, but hard to fall back on when the user's odd-brand machine crashes before a major observing proposal deadline. And is Information Technology the 'core business' of an observatory? Why not outsource the whole messy affair? These and other related issues will be discussed (though not resolved) at the conference.

The New User Interface for the OVRO Millimeter Array

S. Scott, R. Finch (Caltech/OVRO)

A new user interface for the OVRO Millimeter Array is in the early phase of implementation. The basic requirements are to provide monitoring and control of the array with a web-based interface from anywhere on the Internet. The interface must remain satisfying even over the low-bandwidth connections typical of analog modems.

Web browser based clients have been developed to display monitoring information from the array that is updated in realtime. These monitor clients are written in Java and are tailored to present different sets of information about the array. The data are usually rendered as numerical text values in data cells, with the color of the cells representing the state of the component. Many of the data cells are arranged in tabular form to provide a dense display and to emphasize the relationship of the data. The data cells in the table also function as the selection menu to initiate live realtime plots. This flexible plotting is a key feature of this new interface. Data caching and compression are two techniques used to enhance the utility of the monitor clients.

The monitoring clients are fed data over TCP/IP circuits from Unix server programs. These C++ server programs get their data from a large shared memory. The shared memory is updated twice a second by daemon programs that receive datagrams from microprocessors embedded in the array hardware.
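The data path just described (shared state, serialized by a server, streamed to a display client) can be sketched as follows. This is not OVRO code: the monitor-point names are invented, a socketpair stands in for the network, and JSON stands in for whatever wire format the real servers use.

```python
import json
import socket

# Assumed monitor points, standing in for the shared-memory segment.
shared_memory = {"lo_freq_ghz": 98.2, "rx_temp_k": 4.1}

server, client = socket.socketpair()

# Server side: serialize the current snapshot and ship it down the stream.
server.sendall(json.dumps(shared_memory).encode() + b"\n")

# Client side: read one newline-delimited snapshot and decode it.
buf = b""
while not buf.endswith(b"\n"):
    buf += client.recv(4096)
snapshot = json.loads(buf)
print(snapshot["rx_temp_k"])

server.close(); client.close()
```

The real clients add the techniques the abstract mentions, caching and compression, on top of exactly this kind of periodic snapshot stream.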

The control aspect of the interface is in the design stages as the Java security model evolves to provide the flexibility needed for remote access to control functions.

The ISO Spectral Analysis Package ISAP

E. Sturm, O.H. Bauer, D. Lutz, E. Wieprecht (MPE), G. Helou, I. Khan, S. Lord, B. Narron, S. Unger (IPAC), M. Buckley (RAL), F. Vivares (CESR), L. Verstraete (IAS), P.W. Morris (ISO SOC)

We briefly describe the ISO Spectral Analysis Package ISAP. This package has been, and is still being, developed to process and analyse the data from the two spectrometers on board ISO, the Infrared Space Observatory of the European Space Agency (ESA). ISAP is written in pure IDL. Its command-line mode and its widget-based graphical user interface (GUI) are designed to provide ISO observers with a convenient and powerful tool to cope with data of a very complex character and structure. ISAP is available via anonymous ftp and is already in use by a worldwide community.

Suggested presentation: Demo

Demonstration of the Grid OCL System

I. Taylor (Univ. Wales)

For the demo, I will give a hands-on demonstration of the Grid OCL system. Conference attendees can try Grid OCL out for themselves or be given a tour of the system, including demonstrations of how to: create and connect units; group units to make custom configurations; import, export and display various different types of data; use the 2D graphical displayer to zoom in and out; and write custom units which can then, in turn, be used in the same way as other units within the Grid OCL system.

Suggested presentation: Demo

IRAF

D. Tody and the IRAF Group (NOAO/IRAF)

The Image Reduction and Analysis Facility (IRAF) is an image processing and astronomical data analysis system developed by the IRAF group at the National Optical Astronomy Observatories (USA). The ADASS IRAF demo will feature the newly released IRAF Version 2.11, the latest X11IRAF visualization tools, portions of the Mosaic Data Handling System used for CCD data acquisition, and the distributed object and message bus data system framework being developed by NOAO. The IRAF group will be available to answer questions about the various aspects of the IRAF installations, reductions and programming tools.

Specific description of the demo:

We plan to demo the IRAF system. We will need a workstation or the equivalent: a 17-20" color monitor, 600 MB of disk space, a minimum of 32 MB of memory, the SunOS or Solaris operating system running X Windows, and a DAT, Exabyte, or CD drive for downloading the software initially. We prefer a Sun workstation if at all possible.

Open IRAF Message Bus and Distributed Object Technology


In this decade we have seen software become increasingly large and complex. Although programs may be well structured internally, using hierarchically structured class libraries to modularize the code, our programs have grown so large that they are monolithic and inflexible, with a high degree of interdependence among the internal modules. The technology needed to address this problem, being developed now by academia and commercial consortia, is known as distributed objects. Distributed objects allow major software modules to be encapsulated as objects and instantiated as separate processes, threads, or classes (procedures). Tying it all together is the message bus, which provides flexible services and methods for distributed objects to communicate with one another. Applications are built by linking precompiled components and services together at runtime on the message bus. This paper presents the message bus and distributed object framework being developed by NOAO and outside collaborators as part of the Open IRAF initiative. This project is funded in part by the NASA ADP and AISR programs.
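The message-bus pattern described above can be reduced to a few lines: components register handlers and exchange messages by subject, with no compile-time dependence on one another. This sketch is generic; the class, subject names, and payload are invented and bear no relation to the actual IRAF bus API.

```python
class MessageBus:
    """Toy publish/subscribe bus: subjects map to lists of handlers."""

    def __init__(self):
        self.handlers = {}

    def subscribe(self, subject, handler):
        self.handlers.setdefault(subject, []).append(handler)

    def publish(self, subject, payload):
        # deliver to every handler registered for this subject
        for handler in self.handlers.get(subject, []):
            handler(payload)

bus = MessageBus()
received = []
bus.subscribe("image.displayed", received.append)   # e.g. an analysis task
bus.publish("image.displayed", {"frame": 1})        # e.g. the display tool
print(received)
```

The decoupling is the point: the publisher knows only the subject string, so components can be added, replaced, or run as separate processes without recompiling their peers.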

The IRAF Mosaic Data Reduction Package

F. Valdes (NOAO/IRAF Group)

The design of a data reduction system for the NOAO Mosaic Camera was presented at the ADASS VI conference. This paper reports on the first implementation of this system as an IRAF package. Details and examples of the tools are given. Particular attention is given to the central role played by the coordinate system in producing high quality final images from a mosaic of CCDs.

World Coordinate Systems as Objects

R.F. Warren-Smith (Starlink, Rutherford Appleton Laboratory), D.S. Berry (Starlink, University of Manchester)

We describe a new library (AST) which provides a flexible high-level programming interface for handling World Coordinate Systems (WCS) in astronomy. It includes, but is not limited to, a wide range of celestial coordinate systems and supports the Digital Sky Survey plate solutions and the draft FITS WCS proposals amongst other possibilities.

AST is a general tool for describing coordinate systems and the relationships which exist between them, and for storing this information in astronomical datasets. It also supports the retrieval, exploration and manipulation of WCS information, such as searching for coordinate systems, transforming coordinate values and aligning data arrays. A comprehensive range of graphics facilities is provided, including the plotting of annotated grids and axes.

Using AST, programmers may define their own new coordinate systems based on the built-in facilities. In future we plan to extend these facilities to include time, wavelength, and other coordinate domains.

Internally, AST makes extensive use of object-oriented techniques, but is written in ANSI C for portability and presents a conventional interface to FORTRAN and C programmers. Dependence on other software has been minimised and AST operates independently of any data system or environment, although it is easily interfaced to such systems when required. This, together with its flexibility, makes AST suitable for a wide range of projects.

The Interaction of the ISO-SWS Pipeline software and the ISO-SWS Interactive Analysis System

E. Wieprecht (ISO/MPE), F. Lahuis (ISO/SRON)

We describe the interaction of the ISO SWS Pipeline software and ISO SWS Interactive Analysis system on different hardware platforms.

The pipeline software is coded in FORTRAN to work within an environment designed by the European Space Agency (ESA). It is used for bulk processing without human interaction, the final product (the Auto Analysis Result) being distributed to the observers. The pipeline software is designed in a modular way, with all major steps separated into software sub-modules.

The Interactive Analysis system is set up as a tool box in an Interactive Data Language (IDL) environment. The IA system was developed to fulfil four main requirements:

* Debugging the pipeline software
* Analysing the performance of SWS
* Determining the calibration parameters
* Scientific analysis of SWS data

Some parts are coded in FORTRAN, and the FORTRAN pipeline modules themselves are included in the system; it is thus possible within the IA system to execute the pipeline step by step.

We describe measures taken to design the Interactive Analysis system in a hardware-independent way, including the interface to the FORTRAN pipeline modules. It currently runs on VAX, DEC Alpha, Solaris and HP computers under VMS and UNIX, and the software is used at eight sites under different hardware conditions.

All systems of IA are controlled by a configuration control system (CoCo).


F. Lahuis, ISO SWS Data Analysis, this conference
M.F. Kessler et al., The Infrared Space Observatory (ISO) mission, 1996, A&A 315, L27
Th. de Graauw et al., Observing with the ISO Short-Wavelength Spectrometer, 1996, A&A 315, L49
Interactive Data Language (IDL), Research Systems, Inc. (preliminary reference)