AUTEC User’s Guide
justinv@hep.stanford.edu
Stanford University
October 2002
This document gives a brief overview of important information for collaborators (especially new ones) in the AUTEC acoustic neutrino detection study. Hopefully the organization of the code, data, and notes described here is stable enough that it will not need to change, so that questions down the line can be answered easily.
The study has a web site at http://hep.stanford.edu/neutrino/SAUND/. Of particular interest to collaborators is the "internal" section, which has several useful documents sketching the progress of the study.
Jack Cecil (cecil@wpb.nuwc.navy.mil)
Our main Navy contact at AUTEC, who helped make our use of the array possible. He knows a lot about the big picture of the array and is happy to help us use it well. He is located most often in West Palm Beach, FL (not at AUTEC itself).
Dan Belasco (belasco@wpb.nuwc.navy.mil)
Our main contact at AUTEC Site 3, where our data acquisition system is located. Dan runs our system (turns it off when they are running a test and back on afterward), installs new data acquisition software when we give it to him, and transfers data to an external hard drive to be shipped to us. He knows a good deal about computers (he would be able to make minor upgrades to the system, such as adding memory) and about the AUTEC electronics (hydrophones + amplifiers). Phone contact with Site 3 is very difficult, but Dan responds to email daily.
Nikolai Lehtinen (nleht@phys.hawaii.edu)
Worked on the project as a post-doc during 2000 and 2001. He researched and simulated the expected signal in preparation for the experimental study, and designed the basics of the acquisition system (the digital matched filter). He did a good deal of analysis between when we started taking data in July 2001 and when he left in December 2001. He knows the physics very well.
Yue Zhao (newtonii@stanford.edu)
Worked on analysis during the summer of 2002. He analyzed the reflections of signals off the sea floor. This analysis is useful for constraining the source location (to a cone) and for verifying that we understand the acoustics well (transmission and reflection coefficients can be approximated).
Justin Vandenbroucke (justinv@hep.stanford.edu)
Worked on the project as an undergrad and then as a temporary employee from December 2000 to October 2002. He wrote most of the online data acquisition software, starting from a core that Nikolai wrote, installed it at Site 3, and calibrated it with a light bulb drop in July 2001. He upgraded the system in December 2001, analyzed data from July 2001 through October 2002, and made upgrades to the acquisition system that were transferred to Dan Belasco electronically.
Most of the AUTEC work was done initially on two PCs (on hep26.stanford.edu by Justin and on hep14.stanford.edu by Nikolai and Yue). Some was also done on hep.stanford.edu, a UNIX system, as well as on the Stanford leland computing system. In September 2002 almost all data, analysis tools, and notes were moved to erinyes.stanford.edu, a Linux cluster of 10 CPUs. All AUTEC work on erinyes is located in /Data/AUTEC/.
Here is the procedure we have been following for data transfer:

1. The external hard drive for transferring data is shipped back and forth between Site 3 and Stanford via Jack Cecil in West Palm Beach.
2. When new shipments of data arrive on the external hard drive, the data are copied to erinyes. This can be done with WinSCP and typically takes overnight.
3. A zip file is also made for each daily folder of data, and these zip files are copied to both hep and hep26.
4. The data are verified by reading all fields of all files, and then Dan is notified that he can delete them from the Site 3 PC (he leaves copies there until this notification in case there is a problem).
5. The hard drive is then shipped to Jack Cecil:
Jack Cecil
Naval Undersea Warfare
Center
Detachment AUTEC
801 Clematis St.
West Palm Beach, FL 33401
561-832-8566
Both Jack and Dan Belasco are notified via email to expect the hard drive. Dan will then hold the hard drive until notified to ship it. The PC at Site 3 can hold 40 GB of data. If it fills up, he can start compressing the data, but this is time consuming. Expect ~2 weeks in each direction for transit between Stanford and Site 3.
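The per-day zip-and-verify step of this procedure can be sketched as follows. This is only an illustration: the folder layout and file names below are hypothetical, not the real /Data/AUTEC structure, and the real verification also reads all fields of the data files.

```python
# Hypothetical sketch of the per-day archiving step. One zip archive is
# made per daily data folder, then integrity-tested before Site 3 is
# told the originals can be deleted. All names here are made up.
import os
import tempfile
import zipfile

work = tempfile.mkdtemp()
day = os.path.join(work, "2001.07.15")          # one folder per day of data
os.makedirs(day)
with open(os.path.join(day, "run01.dat"), "w") as f:
    f.write("dummy hydrophone record\n")

# Make one zip archive per daily folder.
archive = day + ".zip"
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for name in os.listdir(day):
        zf.write(os.path.join(day, name), arcname=name)

# Check the archive before notifying Dan that the Site 3 copy can go.
with zipfile.ZipFile(archive) as zf:
    assert zf.testzip() is None   # None means every member passed its CRC check
```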
As of this summer I got more organized and started online log files in which I made entries detailing progress on the project. Each entry includes the names of relevant Matlab programs, so this is probably the best way to learn about the existing analysis code. Some entries contain very useful information that would be time-consuming to duplicate, but unfortunately you have to wade through many entries to find it. Briefly skimming the logs should be enough to get an idea of what's there. Yue made similar notes that are separated into different documents by subject. Both Yue's and Justin's notes are available in the "internal" section of the web site. Useful images (including plots) can be copied directly from these pages in .jpg format.
All past and current versions of the Labview online (data acquisition) software are archived on erinyes in /Data/AUTEC/AUTEC_programs_archive/. Initially both current and old versions of each subprogram were kept in the same directory, but this became cumbersome. Now each version of the code has a folder to itself (AUTEC_programs_yyyy.mm.dd/) and is independent of all other versions.
Significant upgrades can be made on hep26 and debugged to a large degree using the simulated setup there. hep26 is an exact duplicate of the system at Site 3, with the single exception of having a faster hard drive. It has the same data acquisition card and BNC interface. The BNC interface is plugged into a signal generator that can generate three independent signals (which are then duplicated among the 7 digitized channels).
To upgrade the code, my procedure has been: duplicate the most recent folder, rename it according to the date, and make changes to the new version. I then make the folder and its contents read-only, compress the folder, and put it on a web site. I then email the URL to Dan Belasco, who can download and install it. The zip file should have the format AUTEC_programs_yyyy.mm.dd.zip. That way he keeps all old versions and can revert to one if necessary. It's worthwhile to have Dan email ~5 minutes of data to you so you can verify the upgrade further than is possible with the hep26 setup.
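The versioning bookkeeping in this procedure can be sketched as follows. The folder dates and the file inside are hypothetical; in practice the new folder is named with the current date and the edits are made in Labview.

```python
# Sketch of the code-upgrade bookkeeping: copy the latest version folder,
# rename it by date, lock it read-only, and zip it for Dan to download.
# Folder names and the placeholder file are illustrative only.
import os
import shutil
import stat
import tempfile

work = tempfile.mkdtemp()
old = os.path.join(work, "AUTEC_programs_2001.12.01")
os.makedirs(old)
with open(os.path.join(old, "acquire.vi"), "w") as f:
    f.write("placeholder Labview program\n")

# Duplicate the most recent version folder and rename it by date.
versions = sorted(d for d in os.listdir(work) if d.startswith("AUTEC_programs_"))
new = os.path.join(work, "AUTEC_programs_2002.10.15")  # today's date in practice
shutil.copytree(os.path.join(work, versions[-1]), new)

# ... edits to the new version would go here ...

# Make the finished version read-only, then compress it as
# AUTEC_programs_yyyy.mm.dd.zip for Dan to download and install.
for name in os.listdir(new):
    os.chmod(os.path.join(new, name), stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
shutil.make_archive(new, "zip", root_dir=work, base_dir=os.path.basename(new))
```

Keeping each dated zip intact is what lets Dan revert to any earlier version if an upgrade misbehaves.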
Data can be analyzed on erinyes using hep26 as an X terminal. There are two X servers on hep26: XSecure Pro (a trial version which quits after a couple of hours but is fast) and Exceed (a trial version good for ~40 days which doesn't quit but is quite slow with the Matlab editor for some reason). All analysis software is on erinyes in /Data/AUTEC/AUTEC_analysis. This folder has several subfolders, including m_files, mat_files, and text_files. m_files contains Matlab programs. Matlab can be run from an ssh connection to erinyes (with Exceed or XSecure running) by typing "matlab -nodesktop" (the nodesktop option makes it run faster). There are many independent Matlab programs that perform various analysis tasks, and there are several utilities that are used by many programs, as you will discover by reading several analysis programs.
Recent Matlab programs are better documented than older ones. The m_files folder contains many programs, some of which are obsolete. Some but not all of these have been moved to m_files/obsolete/. Matlab programs typically write the output of their analysis to .mat files in the mat_files folder. A program called program_name.m typically analyzes an hour of data and writes it to a file program_name.yyyy.mm.dd.hh.mat in the folder mat_files/name/. Another program is then used to plot the output in these .mat files. Most plots can be produced within a minute or so with a .m program that reads .mat files generated by a different program that may have taken many hours to run.
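The naming convention above can be illustrated as follows. The program name "filtercounts" is made up for this sketch; real names come from the actual .m programs in m_files.

```python
# Illustrative sketch of the mat_files naming convention:
# mat_files/name/program_name.yyyy.mm.dd.hh.mat, one file per hour.
# "filtercounts" and all paths here are hypothetical.
import glob
import os
import tempfile

work = tempfile.mkdtemp()
outdir = os.path.join(work, "mat_files", "filtercounts")
os.makedirs(outdir)

# An analysis program writes one .mat output file per hour of data:
for hh in ("00", "01", "02"):
    open(os.path.join(outdir, "filtercounts.2001.07.15.%s.mat" % hh), "w").close()

# A plotting program would then gather one day's hourly outputs:
hourly = sorted(glob.glob(os.path.join(outdir, "filtercounts.2001.07.15.*.mat")))
```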
The m_files directory contains many independent pieces of analysis code. Most are reasonably named, but it's generally difficult to find a particular one for a particular task. The best way to find them is likely to look through Yue's and my notes as described above. These give the names of relevant .m programs.