
Changeset 8703


Timestamp:
Aug 29, 2006, 6:19:30 PM
Author:
eugene
Message:

updates to text

Location:
trunk/doc/design
Files:
2 edited

  • trunk/doc/design/ippCDRresponse.tex

    r8140 r8703  
    2020
    2121% -- Revision History --
    22 \RevisionsStart
     22%\RevisionsStart
    2323% version     Date         Description
    24 DR.01     & 2006.07.28 & First Draft \\ \hline
    25 \RevisionsEnd
     24%00     & 2006.08.13 & Initial Release \\ \hline
     25%\RevisionsEnd
    2626
    2727\tableofcontents
     
    104104This question appears to be principally concerned with perceived
    105105overlap between the IPP DVO system and the PSPS Object Data Manager
    106 (ODM).  Where there is superficial similarity between the systems, it
     106(ODM).  While there is superficial similarity between the systems, it
    107107is our belief that the development of two complementary systems is not
    108 only worthwhile, but extremely important for a variety of reasons. 
    109 
    110 The IPP operational requirments include the analysis of images to
     108only worthwhile, but extremely important for a variety of reasons.
     109
     110The IPP operational requirements include the analysis of images to
    111111produce the P2, P4$\Delta$, and P4$\Sigma$ detections, the Static Sky
    112112images, the Astrometric and Photometric Reference catalog, and the
     
    128128and objects, along with the photometric and astrometric calibration
    129129information.  There are several motivations for defining a separate
    130 entity within the IPP specifially for this task:
     130entity within the IPP specifically for this task:
    131131
    132132\begin{itemize}
    133 \item the PSPS system is foreseen as the definitive view on the object
     133\item The PSPS system is foreseen as the definitive view on the object
    134134  database problem, while the IPP requires the ability to manipulate
    135135  and update the astrometric and photometric calibrations. 
    136 \item the PSPS system requires a high-level of sophistication in the
      136\item The PSPS system requires a high level of sophistication in the
    137137  scientific queries which may be performed.  While critical, this
    138   sophistication will result in a much longer development timeline for
     138  sophistication will result in a much longer development time line for
    139139  the PSPS. 
    140140\item By performing the analysis of the astrometric and photometric
     
    146146  trade-off allows the IPP DVO system to operate at a much higher
    147147  throughput than is possible for the PSPS, and perform more
    148   reprocessing operations.
     148  re-processing operations.
    149149\end{itemize}
    150150
    151151It is useful to ask if the PSPS could have been ready within the
    152 necessary timeframe, would the IPP use that instead of the DVO system
     152necessary time frame, would the IPP use that instead of the DVO system
    153153for support of photometric and astrometric calibrations.  The answer
    154154would be 'yes' only if the PSPS requirements were modified to allow it
     
    157157subsystems after they have both matured?  My opinion, and I believe,
    158158that of the project management, is that this choice would result in
    159 development of a system which would be substantially overengineered
     159development of a system which would be substantially over-engineered
    160160for its role in either of the two subsystems, with concomitant
    161161increase in the cost of development and support.  By dividing the
     
    187187  psphot and ppImage upgrades.
    188188\item {\bf Release Capabilities} Image warping, Image differencing,
    189   Image stacking, object modelling for difference images (loading PSF
     189  Image stacking, object modeling for difference images (loading PSF
    190190  from an external source, fitting positive and negative sources),
    191191  analysis of the OTA guide kernel, detrend images convolved with the
     
    220220We have worked with the other PS1 teams to define complete ICDs
    221221between all of the interacting subsystems.  The IPP interfaces with 4
    222 existing subsystems, and will interact with several other science
    223 clients.  We have completed the definitions of the ICDs for the
    224 Camera-IPP (PSDC-940-003), OTIS-IPP (PSDC-940-004), IPP-MOPS
     222existing subsystems, and will interact with other as-yet-undefined
     223science clients.  We have completed the definitions of the ICDs for
     224the Camera-IPP (PSDC-940-003), OTIS-IPP (PSDC-940-004), IPP-MOPS
    225225(PSDC-940-005), and IPP-PSPS (PSDC-940-006).  These are now part of
    226226the PSDC document tree.
     
    277277The CDR Review Committee requests a plan for finalizing the analysis
    278278algorithms within the IPP.  There are three stages to freezing the
    279 analysis algorithims within the IPP:
     279analysis algorithms within the IPP:
    280280\begin{itemize}
    281281\item Algorithm conceptual design. 
     
    287287The IPP SSDD defines the conceptual details of nearly all of the
    288288analysis algorithms in Sections 5, 6, and 7.  Of these descriptions,
    289 only Sections 5.5.5, describing the stacking of the Static Sky,
    290 and Section 7, defining the Static Sky photometry analysis, are
    291 insufficient in detail for the analysis to be well understood.  Also
    292 missing from version 00 of the document is a discussion of the
    293 creation of the astrometric and photometric reference catalogs.  A new
    294 version of the SSDD with these three sections updated \note{will be
    295 posted by DATE}.
     289only Sections 5.5.5, describing the stacking of the Static Sky, and
     290Section 7.2, defining the Static Sky photometry analysis, are
     291insufficient in detail for the analysis to be well understood.  The
     292discussion of the analysis used to generate the astrometric and
      293photometric reference catalog is included in Section 4 on the design
      294of DVO, and is somewhat sparse.  A new version of the SSDD with these
     295three sections updated will be posted by Aug 30.
    296296
    297297Table~\ref{algorithms} lists the IPP algorithms, the relevant program
    298298in the IPP tree, the development state of the program with respect to
    299299the identified analysis, and the data needed to set the algorithm
    300 parameters.  In a separate document (REF), we present a detailed
    301 discussion of the choices to be made and guidelines on making those
    302 choices.
     300parameters.  In a separate document (`How to Define IPP Analysis
     301Parameters'), we present a detailed discussion of the choices to be
     302made and guidelines on making those choices.
    303303
    304304There are a handful of decisions which need to be made which have an
     
    332332  analysis be minimized since it is impossible to afford the I/O load
    333333  demanded by a large number of input fringe images.  A related
    334   question is that of how to subselect the night-time fringe images
     334  question is that of how to sub-select the night-time fringe images
    335335  for best effect, if sky fringe images are used.  Based on experience
    336336  from CFHT/Megacam, it may be possible to use fringe images selected
    337337  on the basis of the time of night, but this must be tested for
    338   Haleakala.  It seems unlikely at this time that a spectral skyprobe
    339   will be available for the start of PS1, so we cannot rely on such a
    340   device to guide our choices.
     338  Haleakala.  An alternative strategy, in which master fringe images
     339  are combined based on their consistency with the science image, has
     340  also been implemented for the IPP.  The decision between these
     341  options will be guided by further testing of Megacam images and real
     342  Haleakala data in a range of conditions.  It seems unlikely at this
     343  time that a spectral skyprobe will be available for the start of
     344  PS1, so we cannot rely on such a device to guide our choices.
    341345
    342346\item {\bf Static Sky Cell definition} What is the layout of the
     
    376380  range of field angles into the static sky.  These will have a range
    377381  of image qualities.  We will need to set the cuts to trade-off
    378   between degredation of the final image quality versus degredation of
     382  between degradation of the final image quality versus degradation of
    379383  the signal-to-noise in the final image.  In general, our guidance
    380384  for the Static Sky is to maximize our ability to measure accurate
     
    391395  common seeing (for a stable PSF across the image for difference
    392396  image)?  Is it possible to measure the weak lensing parameters
    393   sufficiently accuractly from the stacked image with knowledge of the
     397  sufficiently accurately from the stacked image with knowledge of the
    394398  PSF?  To what level of detail is the PSF model required?  Is it
    395399  necessary to perform the weak lensing analysis (and other galaxy
     
    410414steps.  The functional flow of these different analysis steps can be
    411415seen in IPP SSDD Figures 22-30.  The use of the Q/A measurements is
    412 not summarized very clearly within the text of version 00; \note{this
    413 will be updated in the new SSDD release}.  Within the IPP, the
    414 analysis stages use these measurements to mark input images as
    415 accepted or rejected.  These assessments are passed back to the OTIS
    416 subsystem, along with the image statistics measured for the input
    417 images.  OTIS has the option of setting more stringent filters on the
    418 input images and re-observing images on the basis of the IPP feedback,
    419 even if the IPP accepted images which OTIS re-observes.  There is also
    420 a system-wide plan in place to use feedback from the IPP and from OTIS
    421 to guide the project's choices for survey strategies and science
    422 goals.
     416not summarized very clearly within the text of version 00; this will
     417be updated in the new SSDD release.  The types of Q/A measurements are
     418also discussed below.
     419
     420Within the IPP, the analysis stages use these measurements to mark
     421input images as accepted or rejected.  These assessments are passed
     422back to the OTIS subsystem, along with the image statistics measured
     423for the input images.  OTIS has the option of setting more stringent
     424filters on the input images and re-observing images on the basis of
     425the IPP feedback, even if the IPP accepted images which OTIS
     426re-observes.  There is also a system-wide plan in place to use
     427feedback from the IPP and from OTIS to guide the project's choices for
     428survey strategies and science goals.
    423429
    424430\subsection{Detrend Images}
    425431
    426 The input detrend images all have their pixel count levels measured
    427 for each chip.  Input images which have counts or fluxes outside of a
     432The input detrend images have their pixel count levels measured for
     433each chip.  Input images which have counts or fluxes outside of a
    428434defined range will be flagged and excluded from any detrend analysis.
    429435For example, the flat-field images should never use input images which
    430436are saturated, nor should the dark image analysis use input images
    431437with flux levels wildly outside of the nominal range.  Both conditions
    432 are evidence that the observing process was performed inappropriately.
     438are evidence that the data were incorrectly obtained.
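The screening described above amounts to a simple per-chip range check. A minimal sketch (the threshold values and field names here are illustrative assumptions, not actual IPP parameters):

```python
# Sketch of the detrend input screening described above: images whose
# per-chip count levels fall outside a defined range are flagged and
# excluded from the master-detrend analysis.  Thresholds and field
# names are illustrative assumptions, not actual IPP parameters.

def screen_detrend_inputs(images, min_counts, max_counts):
    """Split input detrend images into accepted and flagged lists."""
    accepted, flagged = [], []
    for img in images:
        if min_counts <= img["median_counts"] <= max_counts:
            accepted.append(img)
        else:
            flagged.append(img)  # e.g. saturated flats, wild dark levels
    return accepted, flagged

inputs = [
    {"name": "flat01", "median_counts": 21000},
    {"name": "flat02", "median_counts": 64000},   # near saturation
    {"name": "flat03", "median_counts": 18500},
]
ok, bad = screen_detrend_inputs(inputs, min_counts=5000, max_counts=40000)
```

In practice the accepted/flagged assessments would be recorded alongside the image statistics, as described for the OTIS feedback above.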
    433439
    434440In addition to raw pixel values, the input detrend images are
    435 contronted with the resulting master detrend images.  In general, the
    436 effect corrected by the master detrend image should adaquately correct
     441confronted with the resulting master detrend images.  In general, the
     442effect corrected by the master detrend image should adequately correct
    437443each of the input raw detrend images.  The residual scatter of the
    438444detrended raw images should be small.  As part of the detrend creation
     
    483489  scatter, and substantial photometric offsets are all evidence of
    484490  problems with the observing conditions.  These may be the presence
    485   of clouds and/or haze, degredation of the optics, and/or extreme
     491  of clouds and/or haze, degradation of the optics, and/or extreme
    486492  image-quality problems.
    487493\end{itemize}
     
    500506result in large deviations between the components of an image stack,
    501507and will also result in large numbers of difference image detections.
    502 Similarly, bright stars with larger than expected halos or saturation
     508Similarly, bright stars with larger-than-expected halos or saturation
    503509regions will result in excess difference image detections.
    504510
     
    519525hardware to meet the processing and I/O requirements.
    520526
    521 That analysis is based largely on prototype tests of our processing
    522 algorithms, and is somewhat limited by being focused on the steady
    523 state operations.  We present here new numbers for the processing
    524 timeline based on the current baseline software on our existing
    525 baseline cluster hardware.
    526 
    527 For PS1, there is a significant processing challenge in the first 6 -
    528 9 months when only a fraction of the IPP storage hardware will be
    529 available.  This period is further complicated by the budgetary
    530 constraints placed on the IPP to limit the hardware purchase to a bare
    531 minimum.  An important area for clarification by the project is the
    532 processing requirement in the beginning of the project.  If the IPP is
    533 required to perform a complete Static Sky analysis on every image as
    534 it becomes available, then the total hardware required in the first
    535 6-9 months for processing must be increased.  If it is only necessary
    536 to stack sets of, for example, 4 images as they are available, then
    537 the requirements are somewhat reduced.  A trade-off must be made by
    538 the project to choose between these options.
     527That analysis was based largely on prototype tests of our processing
     528algorithms and has been made out-of-date by changes to the survey
     529plans.  We present here new numbers for the processing time line based
     530on the current baseline software on our existing baseline cluster
     531hardware.
     532
     533% For PS1, there is a significant processing challenge in the first 6 -
     534% 9 months when only a fraction of the IPP storage hardware will be
     535% available.  This period is further complicated by the budgetary
     536% constraints placed on the IPP to limit the hardware purchase to a bare
     537% minimum.  An important area for clarification by the project is the
     538% processing requirement in the beginning of the project.  If the IPP is
     539% required to perform a complete Static Sky analysis on every image as
     540% it becomes available, then the total hardware required in the first
     541% 6-9 months for processing must be increased.  If it is only necessary
     542% to stack sets of, for example, 4 images as they are available, then
     543% the requirements are somewhat reduced.  A trade-off must be made by
     544% the project to choose between these options.
    539545
    540546The data storage requirements are determined from the design reference
     
    546552images per year.  Combining these two, we find that the total number
    547553of raw image data is roughly 1.7PB (555,000 images).  In addition, we
    548 have a requirement for Static Sky storage (using 0.2 arcsec pixels) of
    549 roughly 300TB, and miscellaneous additional storage of nearly 100 TB.
    550 Our hardware purchase plan has a minimum total storage of 2.4PB,
     554have a requirement for Static Sky storage (assuming 0.2 arcsec pixels)
     555of roughly 300TB, and miscellaneous additional storage of nearly 100
     556TB.  Our hardware purchase plan has a minimum total storage of 2.4PB,
    551557giving us a margin of about 10\%.  Our plan is to purchase the
    552558hardware in 5 stages of 16 computers each, for a total system cost of
     
    558564required to buy all of the machines up front), we would require
    559565roughly 145 machines, increasing the cost of the cluster to a total of
    560 roughly \$2.2M.  We judge this to be a very low risk.
     566roughly \$2.2M.  We judge this to be a very low risk as hard disk
     567capacities continue to grow.
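The storage budget above can be checked with one line of arithmetic (the ~2.1 PB subtotal is implied rather than stated; treating "nearly 100 TB" as exactly 0.1 PB is an approximation):

```python
# Back-of-envelope check of the storage budget quoted above:
# 1.7 PB raw + 0.3 PB Static Sky + ~0.1 PB miscellaneous, against a
# 2.4 PB minimum purchase plan.  Treating "nearly 100 TB" as 0.1 PB
# is an approximation.
raw_pb, static_sky_pb, misc_pb = 1.7, 0.3, 0.1
required_pb = raw_pb + static_sky_pb + misc_pb    # ~2.1 PB
planned_pb = 2.4
margin = (planned_pb - required_pb) / planned_pb  # ~0.125, i.e. "about 10%"

# Implied average raw exposure size: ~3.1 GB per image.
gb_per_image = raw_pb * 1e6 / 555_000
```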
    561568
    562569We have performed timing tests of the current versions of the IPP tools
     
    572579speeds of the CPU cores, but increasing numbers of cores per socket.
    573580By the end of this year, quad-core processors are expected.  If we
    574 stagger the purchase of the computers as planned, and make reasonable
    575 estimates for the number of cores available, we expect the final
    576 cluster configuration to have between 400 and 800 cores.  We will use
    577 600 as an estimate.  Note that it is possible to supplement the
    578 processing power of the cluster by buying 1U boxes with processors but
    579 no storage.  Each of these boxes cost roughly 15\% of a storage node
    580 and add an equal number of processor cores.  Such an option can be
    581 taken at any time, though it is not needed in our current development
    582 plan.
     581stagger the purchase of the computers as planned (5 sets of 16
     582machines), and make reasonable estimates for the number of cores
     583available, we expect the final cluster configuration to have between
     584400 and 800 cores.  We will use 600 as an estimate.  Note that it is
     585possible to supplement the processing power of the cluster by buying
     5861U boxes with processors but minimal storage.  Each of these boxes
      587costs roughly 15\% of a storage node and adds an equal number of
     588processor cores.  Such an option can be taken at any time to
     589supplement the raw processing power, though it is not needed in our
     590current development plan.
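The quoted 400-800 core range follows from the staged purchase; the cores-per-machine figures below are assumptions chosen to reproduce that range, not numbers from the text:

```python
# Arithmetic behind the 400-800 core range quoted above: 5 purchase
# stages of 16 machines each.  The 5-10 usable cores per machine is
# an assumption about plausible multi-core hardware of the period.
stages, machines_per_stage = 5, 16
machines = stages * machines_per_stage   # 80 storage nodes
low  = machines * 5                      # 400 cores
high = machines * 10                     # 800 cores
estimate = 600                           # working figure used in the text
```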
    583591
    584592There are two potentially dominant analysis steps in the process: the
     
    589597takes roughly 16 seconds for a Megacam chip (single core), equivalent
    590598to roughly 38 seconds on a full GPC-1 chip.  Most other steps of the
    591 analysis scale are constant per image, and contribute only a few
    592 seconds relative to the 38 seconds.  We use 50 seconds per chip per
    593 core to judge the total processing power for the portion which scales
    594 by the number of images. 
     599analysis require a constant amount of time per image, and contribute
     600only a few seconds relative to the 38 seconds.  We use 50 seconds per
     601chip per core to judge the total processing power for the portion
     602which scales by the number of images.
    595603
    596604A useful statistic to judge the capability of the processing system is
    597 the time required to reprocess all images at the end of the survey.
     605the time required to re-process all images at the end of the survey.
    598606Given the total number of images above (555,000), the per-image
    599607analysis portion of the processing would require a total of $\sim 1.8
     
    617625predict the Pan-STARRS magnitudes of stars, then extrapolated the
    618626source counts to our magnitudes limits.  We find 50,000 objects per
    619 square degree above our threshold in this region.  If every image
    620 required the non-linear fitting for this density of objects, and we
    621 accept the 10ms time, this analysis would require a total of $\sim 2.0
    622 \times 10^9$ CPU core-seconds, or about 39 days on the 600 cores.
     627square degree above our threshold in this region.  If we make the
     628conservative assumption that every image required the non-linear
      629fitting for this density of objects, and that each object requires
     63010ms, this analysis would require a total of $\sim 2.0 \times 10^9$
     631CPU core-seconds, or about 39 days on the 600 cores.
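The 39-day figure follows directly from the quoted totals; reconstructing the $2.0 \times 10^9$ core-seconds requires assuming a field of view of roughly 7 square degrees per exposure, which is not stated in the text:

```python
# Check of the galaxy-fitting cost quoted above: 2.0e9 CPU
# core-seconds spread over 600 cores is about 39 days.
core_seconds = 2.0e9
cores = 600
days = core_seconds / cores / 86_400          # ~38.6 days

# Reconstruction of the 2.0e9 total (the ~7 deg^2 field of view is an
# assumption, not a figure from the text): 555,000 images, 50,000
# objects per square degree, 10 ms of non-linear fitting per object.
reconstructed = 555_000 * 50_000 * 7 * 0.010  # ~1.9e9 core-seconds
```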
    623632
    624633In conclusion, given the assumptions above, the processing power of
     
    650659requirements of the IPP for object databasing.  Some effort has been
    651660needed to make DVO completely suitable for its role within the IPP.
    652 Regardless of what object databasing system was chosen, a certain
    653 level of effort would have been required.  In this case, we were clear
    654 just how much would be required, and it was not large.
     661Regardless of what object databasing system was chosen, however, a
     662certain level of effort would have been required.  In the case of DVO,
     663we were clear what effort was required, and it was judged to be
     664reasonable.
    655665
    656666Of that effort, only the ability to support older table formats was
    657 required to maintain CFHT Elixir compatibility.  In fact, this is a
    658 feature which we would have added even if we did not want to maintain
    659 compatibility with CFHT's DVO installation.  We have found in our
    660 experience with the Elixir system that having a rigidly defined schema
    661 hindered the usability and extensibility of the DVO system.  The fixed
    662 tables made it difficult to add new elements to the database, and
    663 required multiple versions to support previously defined tables.  The
    664 new design allows us to be more flexible about changes without fear
    665 that this will break database instances which already exist.  One of
    666 the best ways we have found to test the DVO object databasing system
    667 is to engage students in science projects using DVO.  These projects
    668 explore the user interface and highlight problems and areas for
    669 possible improvement.  Such projects would not be possible if the
    670 users feared that their DVO instances would be unusable in the future
    671 because of lack of backwards compatibility.
     667required to maintain compatibility with the existing CFHT DVO
     668databases.  In fact, this is a feature which we would have added even
     669if we did not want to maintain compatibility with CFHT's DVO
     670installation.  We have found in our experience with the Elixir system
     671that having a rigidly defined schema hindered the usability and
     672extensibility of the DVO system.  The fixed tables made it difficult
     673to add new elements to the database, and required multiple versions to
     674support previously defined tables.  The new design allows us to be
     675more flexible about changes without fear that this will break database
     676instances which already exist.  One of the best ways we have found to
     677test the DVO object databasing system is to engage students in science
     678projects using DVO.  These projects explore the user interface and
     679highlight problems and areas for possible improvement.  Such projects
     680would not be possible if the users feared that their DVO instances
     681would be unusable in the future because of lack of backwards
     682compatibility.
    672683
    673684The PanTasks system used the existing Opihi command-line interface
     
    684695of a large set of real images obtained by the CFHT engineering staff
    685696over several years.  We also gain by discussions with our Elixir
    686 collegues about details of the analysis and possible sources of errors
     697colleagues about details of the analysis and possible sources of errors
    687698observed in the CFHT dataset.  The only cost to the IPP is in
    688699preventing excessive forking of the DVO databasing system, something
  • trunk/doc/design/ippSSDD.tex

    r8140 r8703  
    613613\paragraph{House keeping}
    614614
    615 \subparagraph{Lock sweeping} In the event that a Storage Object operation fails to complete successfully
    616 stale locks will have to be identified and removed from the IPP Pixel
    617 Data Server Database.  This should be done periodically by comparing
    618 the entries in the Lock table to the list of active nodes maintained
    619 by the IPP Controller.  It should also happen as soon as possible
    620 after a node goes offline (triggered by the IPP Controller marking a
    621 node as offline?).  A sweep must be /completed/ before an offline node
    622 can be marked on-line.
     615\subparagraph{Lock sweeping}
     616In the event that a Storage Object operation fails to complete
      617successfully, stale locks will have to be identified and removed from
     618the IPP Pixel Data Server Database.  This should be done periodically
     619by comparing the entries in the Lock table to the list of active nodes
     620maintained by the IPP Controller.  It should also happen as soon as
     621possible after a node goes offline (triggered by the IPP Controller
     622marking a node as offline?).  A sweep must be /completed/ before an
     623offline node can be marked on-line.
    623624
    624625Once a node is determined to be offline all entries in the Lock table
     
    628629table.
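The sweeping logic described above can be sketched as follows (all names and structures are hypothetical, not the actual IPP Pixel Data Server schema):

```python
# Toy sketch of lock sweeping: stale locks held by offline nodes are
# removed by comparing the Lock table to the set of active nodes, and
# a node may not be marked on-line again until a sweep has completed.
# All names here are hypothetical, not the actual IPP schema.

class LockTable:
    def __init__(self):
        self.locks = {}           # lock_id -> owning node
        self.active_nodes = set()

    def acquire(self, lock_id, node):
        self.active_nodes.add(node)
        self.locks[lock_id] = node

    def mark_offline(self, node):
        self.active_nodes.discard(node)

    def sweep(self):
        """Remove locks whose owners are no longer active."""
        stale = [k for k, n in self.locks.items()
                 if n not in self.active_nodes]
        for k in stale:
            del self.locks[k]
        return stale

    def mark_online(self, node):
        # A sweep must be *completed* before an offline node returns.
        if node in self.locks.values():
            raise RuntimeError("sweep required before node can rejoin")
        self.active_nodes.add(node)

tbl = LockTable()
tbl.acquire("img123.lock", "node07")
tbl.mark_offline("node07")
stale = tbl.sweep()        # -> ["img123.lock"]
tbl.mark_online("node07")  # allowed once the sweep has completed
```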
    629630
    630 \subparagraph{Consistency sweeping} Periodically the IPP Pixel Data Server meta-data and Storage Object will need
    631 to be checked for sanity.  This would be similar to running fsck on a
    632 modern filesystem.  Consistency sweeping should include Lock sweeping
    633 and should be considered a super-set.
     631\subparagraph{Consistency sweeping}
     632Periodically the IPP Pixel Data Server meta-data and Storage Object
     633will need to be checked for sanity.  This would be similar to running
     634fsck on a modern filesystem.  Consistency sweeping should include Lock
     635sweeping and should be considered a super-set.
    634636
    635637\subsubsection{Nebulous Database}
     
    14911493photometrically corrected flats (-grid option).
    14921494
     1495\tbd{fill out this discussion in the analysis section on the
     1496astrometric and photometric reference catalog}.
     1497
    14931498\subsubsection{Uniphot : Zero Point Analysis}
    14941499
     
    14961501points for images and the spatial overlap information to determine a
    14971502best set of image zero points which have a specific time scale for the
    1498 atmospheric stability.  This analysis would be used after relative
     1503atmospheric stability.  This analysis is used after relative
    14991504photometry has been determined for data in DVO.  This analysis
    15001505currently is defined to unify the zero points of a collection of
     
    15031508photometry corrections for a collection of images distributed over a
    15041509large range in space and time, but still with significant
    1505 overlap. distritions with subustanailaccount for the c
    1506 
    1507 \subsubsection{Global Astrometry Analysis}
    1508 
    1509 This operation uses the reference and image detections to determine an
    1510 optical distortion model for the camera and static astrometry model
    1511 components.  The astrometry model includes: (1) field distortion
     1510overlap.
     1511
     1512\tbd{fill out this discussion in the analysis section on the
     1513astrometric and photometric reference catalog}.
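The zero-point unification described above is, at heart, a relative-photometry least-squares problem. One schematic form (our notation, not the SSDD's):

```latex
% Sketch of the zero-point unification problem (our notation, not the
% SSDD's).  For detections with instrumental magnitude m_{i,k} of
% object k on image i, solve jointly for the image zero points Z_i and
% the mean object magnitudes \bar{m}_k by minimizing
\chi^2 = \sum_{i,k} w_{i,k}
         \left( m_{i,k} - Z_i - \bar{m}_k \right)^2 ,
% optionally constraining the Z_i to vary smoothly on the chosen
% atmospheric-stability time scale.
```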
     1514
     1515\subsubsection{relastro : Global Astrometry Analysis}
     1516
     1517This operation uses the reference and image detections to improve the
     1518astrometric reference catalog.  It determines an improved optical
     1519distortion model for the camera and static astrometry model
     1520components, and then applies the improved astrometric solutions to the
     1521observations to yield high-quality astrometry for the average object
     1522positions.  The astrometry model includes: (1) field distortion
    15121523introduced by the telescope optics, which is a smoothly-varying
    15131524function of the field position relative to the center of the telescope
     
    15161527along with chip-dependent plate-scale modifications needed to
    15171528represent tilts or warps of the individual detectors relative to the
    1518 ideal flat focal plane. .
     1529ideal flat focal plane. 
     1530
     1531\tbd{fill out this discussion in the analysis section on the
     1532astrometric and photometric reference catalog}.
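The two model components described above (a smooth optics term plus per-chip terms) can be written schematically as follows; the notation and the particular radial form are our assumptions, not the eventual IPP parameterization:

```latex
% Schematic form of the astrometry model described above (our
% notation; the actual IPP parameterization may differ).  A smooth
% radial optics distortion plus per-chip linear terms:
\begin{align*}
  r' &= r \left( 1 + k_1 r^2 + k_2 r^4 \right), \\
  \begin{pmatrix} \xi \\ \eta \end{pmatrix}
     &= \mathbf{A}_c
        \begin{pmatrix} x - x_c \\ y - y_c \end{pmatrix}
      + \begin{pmatrix} \xi_c \\ \eta_c \end{pmatrix},
\end{align*}
% where r is the field angle from the optical axis and the 2x2 matrix
% A_c for chip c absorbs plate-scale changes, tilts, and warps.
```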
    15191533
    15201534\subsubsection{DVO shell}
     
    28352849to an error upstream in the processing).
    28362850
     2851\tbd{add discussion of the choices to be made in generating the
     2852  static sky image stacks: interpolation methods, selection of input
     2853  images by IQ, smoothing of input images by their PSF, weighting and
     2854  clipping of input pixels}
     2855
    28372856Object analysis of the static sky images is {\em not} a part of the
    28382857Phase 4 analysis.  This processing is envisioned to take place
     
    28402859scheduled as a separate analysis task, probably run during the day at
    28412860a time when the computing infrastructure is not under significant load.
     2861
     2862\tbd{add discussion of the multiple image analysis and object
     2863  analysis without the static sky (ie, on all input images at once)}
    28422864
    28432865\subsubsection{Magic and Phase 4 Modifications}
     
    31613183parameter $\nu$, and a collection of annular aperture flux
    31623184measurements, all of which are also measured for the P4$\Sigma$
    3163 images.  In addition, the galaxy-shape parameters $Gamma_1 \&
     3185images.  In addition, the galaxy-shape parameters $\Gamma_1 \&
    31643186\Gamma_2$, along with the complete `polarization' terms are measured,
    31653187as well as a collection of annular aperture flux and variance
     
    31733195per second), it is only necessary to process the complete sky in a
    31743196year, or an average rate of $\sim$2 Mpix per second, or $< 1$\% of the
    3175 object analysis in the other analysis stages.
     3197object analysis in the other analysis stages.  These operations are
     3198all functions which will be performed within the PSPhot program using
     3199recipe options.
     3200
     3201\subsection{Astrometric and Photometric Reference Catalog}
     3202
     3203The IPP is responsible for generating the Astrometric and Photometric
     3204(AP) Reference Catalog.  The IPP provides several tools for performing
     3205this analysis.  The DVO programs \code{relphot}, \code{uniphot}, and
     3206\code{relastro} perform most of the operations required to generate
     3207the AP Reference Catalog.  These include the determination of the
     3208image zero-points, the identification of objects with significant
     3209variability, the detection of individual outlier measurements, the
     3210detection of objects with substantial astrometric error, the analysis
     3211of parallax and proper-motions, etc.  In addition, the DVO shell
     3212program will be used to generate the color transformations from the
     3213observed data and to perform other tests of the catalog quality.
    31763214
    31773215%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     
    31913229functions in the operational system, the IPP will make use of Perl as
    31923230the scripting language to provide the required flow-control to tie the
    3193 modules together. \tbd{note that we use C only, not perl for
    3194 scripting}.
     3231modules together.
    31953232
    31963233This approach satisfies the requirement that complicated low-level
     
    35513588from the selected reference catalog.  The observed sources are matched
    35523589to the reference sources, using either a two-point grid search or
    3553 optionally a \tbd{triangle match}.  Once an approximate match is
    3554 found, a linear fit between detector coordinates an projected
    3555 celestial coordinates is attempted.  The projected coordinate system
    3556 may optionally make use of the default telescope distortion model, if
    3557 it is known.  The radius of the match between observed and reference
     3590optionally a triangle match.  Once an approximate match is found, a
      3591linear fit between detector coordinates and projected celestial
     3592coordinates is attempted.  The projected coordinate system may
     3593optionally make use of the default telescope distortion model, if it
     3594is known.  The radius of the match between observed and reference
    35583595sources is reduced to improve the statistics of the match.  This
    35593596analysis mode is used in the Phase 2 processing.
     
    36103647\subsection{poisub}
    36113648
    3612 Poisub is the image difference analysis program.  \tbd{Paul: please
    3613   flesh this out!}.
     3649Poisub is the image difference analysis program.  \tbd{finish this
     3650discussion}.
    36143651
    36153652\subsection{stac}
     
    36183655same region of the sky.  It consists of two major stages: the warping
    36193656stage and the image combination stage with robust outlier rejection.
    3620 \tbd{Paul: flesh this out!}
     3657\tbd{update / finish this discussion}
    36213658
    36223659\subsection{Command Sequences}
     
    40354072\subsection{IPP Pipelines Overview}
    40364073
     4074\tbd{add the use of Q/A measurements from the IPP CDR Response
     4075document}
     4076
    40374077The IPP as a whole performs all of the image analysis functions
    40384078required by the Pan-STARRS telescopes, including images from the full
     
    43804420header or new exp table?), the exposure is added to the `raw exposure'
    43814421table for images of that type.  The allowed types are `detrend', (all
    4382 bias, dark, flat images), `object', `focus'(??), etc.  (** The
    4383 different tables represent different analysis modes.  This process
    4384 also adds an entry to the exp ID / image file match **).  This process
    4385 also adds all science (OBJECT) exposures to the P1 exposure table (for
    4386 mosaic data) or the P2 chip table (for single detector data).  These
    4387 tables are used to trigger the Phase 1 and Phase 2 analysis stages.
     4422bias, dark, flat images), `object', `focus'(??), etc.  The different
     4423tables represent different analysis modes.  This process also adds an
     4424entry to the exp ID / image file match.  This process also adds all
     4425science (OBJECT) exposures to the P1 exposure table (for mosaic data)
     4426or the P2 chip table (for single detector data).  These tables are
     4427used to trigger the Phase 1 and Phase 2 analysis stages.
    43884428
    43894429\subsection{Phase 1}
     
    45594599rules. 
    45604600
    4561 \note{Phase 4 run can be defined by selecting an observation group, a
     4601\tbd{Phase 4 run can be defined by selecting an observation group, a
    45624602  set of exposures, or a set of rules related to a spatial region (eg,
    45634603  region, time range, and filter)}.
    45644604
    4565 \note{Phase 4 discussion (and diagram) needs more work}
     4605\tbd{Phase 4 discussion (and diagram) needs more work}
    45664606
    45674607\subsection{Analysis Version and Recipes}