It is possible to query online databases and automatically return the results as a Table instance. There are several mechanisms for accessing online catalogs:
An interface to the virtual observatory is provided via the vo module. To list the catalogs available, use the list_catalogs() method from atpy.vo_conesearch:
>>> from atpy.vo_conesearch import list_catalogs
>>> list_catalogs()
USNO-A2
USNO-B1
USNO NOMAD
USNO ACT
A specific catalog can then be queried with a conesearch by specifying a catalog, and the coordinates and radius (in degrees) to search:
>>> t = atpy.Table(catalog='USNO-B1', ra=233.112, dec=23.432, radius=0.3, type='vo_conesearch')
How long this query takes will depend on the speed of your network, the load on the server being queried, and the number of rows in the result. For advanced users, it is also possible to query catalogs not listed by list_catalogs() - for more details, see the Full API for advanced users.
In addition to supporting Virtual Observatory queries, ATpy supports queries to the NASA/IPAC Infrared Science Archive (IRSA). The interface is similar to that of the VO. To list the catalogs available, use the list_catalogs() method from atpy.irsa_service:
>>> from atpy.irsa_service import list_catalogs
>>> list_catalogs()
fp_psc 2MASS All-Sky Point Source Catalog (PSC)
fp_xsc 2MASS All-Sky Extended Source Catalog (XSC)
lga_v2 The 2MASS Large Galaxy Atlas
fp_scan_dat 2MASS All-Sky Survey Scan Info
... ...
The first column is the catalog code used in the query. A specific catalog can then be queried by specifying a query type, a catalog, and additional arguments as required. The different kinds of search are:
Cone: This is a cone search. Requires objstr, a string containing either coordinates or an object name (see here for more information), and radius, with units given by units ('arcsec' by default). For example:
>>> t = atpy.Table('Cone', 'fp_psc', objstr='m13', \
radius=100., type='irsa')
Box: This is a box search. Requires objstr, a string containing either coordinates or an object name (see here for more information), and size in arcseconds. For example:
>>> t = atpy.Table('Box', 'fp_psc', objstr='T Tau', \
size=200., units='deg', type='irsa')
Polygon: This is a polygon search. Requires polygon, which should be a list of tuples of (ra, dec) in decimal degrees:
>>> t = atpy.Table('Polygon', 'fp_psc', \
polygon=[(11.0, 45.0), (12.0, 45.0), (11.5, 46.0)], \
type='irsa')
As for the VO query, how long these queries take will depend on the speed of your network, the load on the IRSA server, and the number of rows in the result.
Note
The following functions should not be called directly - the arguments should be passed to Table()/Table.read() using either type='vo_conesearch' or type='irsa'.
Query a VO catalog using the STScI vo module
This docstring has been adapted from the STScI vo conesearch module:
catalog [ None | string | VOSCatalog | list ]
May be one of the following, in order from easiest to use to most control:
- None: A database of conesearch catalogs is downloaded from STScI. The first catalog in the database to successfully return a result is used.
- catalog name: A name in the database of conesearch catalogs at STScI is used. For a list of acceptable names, see vo_conesearch.list_catalogs().
- url: The prefix of a URL to an IVOA Cone Search Service. Must end in either ? or &.
- A VOSCatalog instance: A specific catalog manually downloaded and selected from the database using the APIs in the STScI vo.vos_catalog module.
- Any of the above 3 options combined in a list, in which case they are tried in order.
- pedantic [ bool ]
- When pedantic is True, raise an error when the returned VOTable file violates the spec, otherwise issue a warning.
- ra [ float ]
- A right-ascension in the ICRS coordinate system for the position of the center of the cone to search, given in decimal degrees.
- dec [ float ]
- A declination in the ICRS coordinate system for the position of the center of the cone to search, given in decimal degrees.
- radius [ float ]
- The radius of the cone to search, given in decimal degrees.
- verb [ int ]
- Verbosity, 1, 2, or 3, indicating how many columns are to be returned in the resulting table. Support for this parameter by a Cone Search service implementation is optional. If the service supports the parameter, then when the value is 1, the response should include the bare minimum of columns that the provider considers useful in describing the returned objects. When the value is 3, the service should return all of the columns that are available for describing the objects. A value of 2 is intended for requesting a medium number of columns between the minimum and maximum (inclusive) that are considered by the provider to most typically useful to the user. When the verb parameter is not provided, the server should respond as if verb = 2. If the verb parameter is not supported by the service, the service should ignore the parameter and should always return the same columns for every request.
Additional keyword arguments may be provided to pass along to the server. These arguments are specific to the particular catalog being queried.
Query the NASA/IPAC Infrared Science Archive (IRSA)
Required Arguments:
- spatial [ ‘Cone’ | ‘Box’ | ‘Polygon’ ]
- The type of query to execute
- catalog [ string ]
- One of the catalogs listed by atpy.irsa_service.list_catalogs()
Optional Keyword Arguments:
- objstr [ str ]
- This string gives the position of the center of the cone or box if performing a cone or box search. The string can give coordinates in various coordinate systems, or the name of a source that will be resolved on the server (see here for more details). Required if spatial is ‘Cone’ or ‘Box’.
- radius [ float ]
- The radius for the cone search. Required if spatial is ‘Cone’.
- units [ ‘arcsec’ | ‘arcmin’ | ‘deg’ ]
- The units for the cone search radius. Defaults to ‘arcsec’.
- size [ float ]
- The size of the box to search in arcseconds. Required if spatial is ‘Box’.
- polygon [ list of tuples ]
- The list of (ra, dec) pairs, in decimal degrees, outlining the polygon to search in. Required if spatial is ‘Polygon’.
Please note!
Much of ATpy’s functionality has now been incorporated into Astropy, and while we will continue to fix bugs, we are no longer actively developing new features in ATpy, instead focusing our efforts on Astropy. If you are a new user and do not need the SQL-querying functionality, we recommend using Astropy Tables directly. If you are already using ATpy and are interested in migrating to Astropy, please read our Guide for Migrating to Astropy.
ATpy is a high-level Python package providing a way to manipulate tables of astronomical data in a uniform way. The two main features of ATpy are:
In addition, ATpy provides a TableSet class that can be used to contain multiple tables, and supports reading and writing to file/database formats that support this (FITS, VO, and SQL databases).
Finally, ATpy provides support for user-written read/write functions for file/database formats not supported by default. We encourage users to send us custom read/write functions to read commonly used formats, and would be happy to integrate them into the main distribution.
The following example shows how ATpy can be used to read, convert, and write a data file from FITS format to VO, HDF5, IPAC, and SQLite formats:
import atpy
tbl = atpy.Table('some_fits_table_file.fits')
# ATpy will automatically try to detect which type of file you're writing.
tbl.write('new_votable.xml') # VO Table
tbl.write('new_ipactable.tbl') # IPAC table
tbl.write('new_hdf5table.hdf5') # HDF5 table
tbl.write('sqlite','new_sqlitetable.db') # SQLite database
# You can easily access and modify data in the table:
tbl.some_column[3] = 4.5
tbl.remove_column('some_other_column')
This is only a small fraction of ATpy’s functionality. We strongly recommend that users read through the documentation, which is available below. For a quick introduction, we recommend the Constructing a table and Accessing Table Data sections. For information about format-specific features, see Supported Formats.
ATpy is developed by Thomas Robitaille and Eli Bressert.
It is often useful to be able to define missing or invalid values in a table. There are currently two ways to do this in ATpy, Null values, and Masking. The preferred way is to use Masking, but this requires at least NumPy 1.4.1 in most cases, and the latest svn version of NumPy for SQL database input/output. Therefore, for version 0.9.4 of ATpy, the default is to use the Null value method. To opt-in to using masked arrays, specify the masked=True argument when creating a Table instance:
>>> t = Table('example.fits.gz', masked=True)
In future, once NumPy 1.5.0 is out, we will switch over to using masked arrays by default, and will slowly phase out the Null value method.
If you want to set the default for masking to be on or off for a whole script, this can be done using the set_masked_default function:
import atpy
atpy.set_masked_default(True)
If you want to set the default for masking on a user-level, create a file named ~/.atpyrc in your home directory, containing:
[general]
masked_default:yes
The set_masked_default function overrides the .atpyrc file, and the masked= argument in Table overrides both the set_masked_default function and the .atpyrc file.
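The ~/.atpyrc file uses a simple INI-style layout, so its parsing can be sketched with the standard-library configparser module (this is an illustration of the file format only, not ATpy's actual configuration code):

```python
import configparser

# A minimal .atpyrc-style file, as described above
rc_text = "[general]\nmasked_default:yes\n"

parser = configparser.ConfigParser()
parser.read_string(rc_text)

# 'yes'/'no' (as well as 'true'/'false') are understood as booleans
masked_default = parser.getboolean('general', 'masked_default')
```

Here `masked_default` comes out as `True`, matching the behavior of passing `masked=True` to every Table.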
The basic idea behind this method is to specify a special value in each column that will signify missing or invalid data. To specify the Null value for a column, use the null argument in add_column:
>>> t.add_column('time', time, null=-999.)
Following this, if the table is written out to a file or database, this null value will be stored.
This method is generally unreliable, especially for floating point values, and does not allow users to easily distinguish between invalid and missing values.
NumPy supports masked arrays, where specific elements of an array can be properly masked by using a mask - a boolean array. There are several advantages to using this:
To specify the mask of a column, use the mask argument in add_column. To do the equivalent to the example in Null values, use:
>>> t.add_column('time', time, mask=time==-999.)
When writing out to certain file/database formats, a masked value has to be given a specific value - this is called a fill value. To set the fill value, simply use the fill argument when adding data to a column:
>>> t.add_column('time', time, mask=time==-999., fill=-999.)
In the above example, if the table is written out to an IPAC table, the value of -999. will be used in place of masked values.
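The mask and fill value are plain NumPy masked-array concepts; a minimal standalone NumPy sketch of what the mask and fill arguments map onto:

```python
import numpy as np

time = np.array([10., -999., 30., 40.])

# mask=time==-999. : a boolean array marking invalid elements
m = np.ma.masked_array(time, mask=(time == -999.))

# Statistics skip masked elements: (10 + 30 + 40) / 3
mean = m.mean()

# fill=-999. : on output, masked slots are replaced by the fill value
filled = m.filled(-999.)
```

Unlike the null-value method, the mask travels alongside the data, so invalid elements are excluded from computations rather than merely flagged by convention.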
Note
When implementing this in ATpy, we discovered a few bugs in the masked structured implementation of NumPy, which have now been fixed. Therefore, we recommend using the latest svn version of NumPy if you want to use masked arrays.
The following pages describe the file formats currently supported, and format-specific options. A full API is also included for advanced users.
Unless stated otherwise below, all table types support the following types:
Note
There are probably as many ASCII table formats as astronomers (if not more). These generally store a single table, and can sometimes include meta-data.
Reading ASCII tables is supported thanks to the asciitable module, which makes it easy to read in arbitrary ASCII files.
By default, several pre-defined formats are available. These include CDS tables (also called Machine-Readable tables), DAOPhot tables, and RDB tables. To read these formats, simply use:
>>> t = atpy.Table('table.mrt', type='mrt')
>>> t = atpy.Table('table.cds', type='cds')
>>> t = atpy.Table('table.phot', type='daophot')
>>> t = atpy.Table('table.rdb', type='rdb')
The type= argument is optional for these formats, if they have appropriate file extensions, but due to the large number of ASCII file formats, it is safer to include it.
ATpy also allows full access to asciitable. If the type='ascii' argument is specified in Table(), all arguments are passed to asciitable.read, and the result is automatically stored in the ATpy Table instance. For more information on the arguments available in asciitable.read, see here.
Note
As for all file formats, the verbose argument can be specified to control whether warning messages are shown when reading (the default is verbose=True), and the overwrite argument can be used when writing to overwrite a file (the default is overwrite=False).
Note
The following functions should not be called directly - the arguments should be passed to Table()/Table.read().
Read data from a CDS table (also called Machine Readable Tables) file
Required Arguments:
- filename: [ string ]
- The file to read the table from
Keyword Arguments are passed to astropy.io.ascii
Read data from a DAOphot table
Required Arguments:
- filename: [ string ]
- The file to read the table from
Keyword Arguments are passed to astropy.io.ascii
Read data from an RDB table
Required Arguments:
- filename: [ string ]
- The file to read the table from
Keyword Arguments are passed to astropy.io.ascii
Read a table from an ASCII file using astropy.io.ascii
Optional Keyword Arguments:
- Reader - Reader class (default = BasicReader)
- Inputter - Inputter class
- delimiter - column delimiter string
- comment - regular expression defining a comment line in table
- quotechar - one-character string to quote fields containing special characters
- header_start - line index for the header line, not counting comment lines
- data_start - line index for the start of data, not counting comment lines
- data_end - line index for the end of data (can be negative to count from the end)
- converters - dict of converters
- data_Splitter - Splitter class to split data columns
- header_Splitter - Splitter class to split header columns
- names - list of names corresponding to each data column
- include_names - list of names to include in output (default=None selects all names)
- exclude_names - list of names to exclude from output (applied after include_names)
Note that the Outputter argument is not passed to astropy.io.ascii.
See the astropy.io.ascii documentation at http://docs.astropy.org/en/latest/io/ascii/index.html for more details.
Write a table to an ASCII file using astropy.io.ascii
Optional Keyword Arguments:
- Writer - Writer class (default = Basic)
- delimiter - column delimiter string
- write_comment - string defining a comment line in table
- quotechar - one-character string to quote fields containing special characters
- formats - dict of format specifiers or formatting functions
- names - list of names corresponding to each data column
- include_names - list of names to include in output (default=None selects all names)
- exclude_names - list of names to exclude from output (applied after include_names)
See the astropy.io.ascii documentation at http://docs.astropy.org/en/latest/io/ascii/index.html for more details.
The Table class is the basic entity in ATpy. It consists of table data and metadata. The data is stored using a NumPy structured array. The metadata includes units, null values, and column descriptions, as well as comments and keywords.
Data can be stored in the table using many of the NumPy types, including booleans, 8, 16, 32, and 64-bit signed and unsigned integers, 32 and 64-bit floats, and strings. Not all file formats and databases support reading and writing all of these types – for more information, see Supported Formats.
The simplest way to create an instance of the Table is to call the class with no arguments:
>>> t = atpy.Table()
A table can be populated either manually or by reading data from a file or database. Reading data into a table erases previous content. Data can be manually added once a table has been read in from a file.
The read(...) method can be used to read in a table from a file. To date, ATpy supports the following file formats:
Now that ATpy integrates with asciitable, the following formats are also supported:
- CDS (type=cds or type=mrt)
- DAOPhot (type=daophot)
- RDB (type=rdb)
- Arbitrary ASCII tables (type=ascii)
When reading a table from a file, the only required argument is the filename. For example, to read a VO table called example.xml, the following should be used:
>>> t.read('example.xml')
Auto-detected input type: VO table
The read() method will in most cases correctly identify the format of the file from the extension. As seen above, the default behavior is to specifically tell the user what format is being assumed, but this can be controlled via the verbose argument.
In some cases, read() will fail to determine the input type. In this case, or to override the automatically selected type, the input type can be specified using the type argument:
>>> t.read('example.xml', type='vo')
The read method supports additional file-format-dependent options. These are described in more detail in Supported Formats.
In cases where multiple tables are available in a table file, ATpy will display a message to the screen with instructions of how to specify which table to read in. Alternatively, see Table Sets for information on how to read all tables into a single TableSet instance.
As a convenience, it is possible to create a Table instance and read in data in a single command:
>>> t = Table('example.xml')
Any arguments given to Table are passed on to the read method, so the above is equivalent to:
>>> t = Table()
>>> t.read('example.xml')
As of 0.9.6, it is now possible to specify URLs starting with http:// or ftp:// and the file will automatically be downloaded. Furthermore, it is possible to specify files compressed in gzip or bzip format for all I/O formats.
Reading a table from a database is very similar to reading a table from a file. The main difference is that for databases, the first argument should be the database type. To date, ATpy supports the following database types:
- SQLite (sqlite)
- MySQL (mysql)
- PostGreSQL (postgres)
The remaining arguments depend on the database type. For example, an SQLite database can be read by specifying the database filename:
>>> t.read('sqlite','example.db')
For MySQL and PostGreSQL databases, it is possible to specify the database, table, authentication, and host parameters. The various options are described in more detail in Supported Formats. As for files, the verbose and type arguments can be used.
As for reading in from files, one can read in data from a database while initializing the Table object:
>>> t = Table('sqlite','example.db')
Note
It is possible to specify a full SQL query using the query argument. Any valid SQL is allowed. If this is used, the table name should nevertheless be specified using the table argument.
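For intuition about what an SQLite read amounts to, the core steps can be sketched with only the standard-library sqlite3 module (this is an illustration, not ATpy's implementation; the table and column names are made up):

```python
import sqlite3

# Build a throwaway in-memory database standing in for 'example.db'
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE example (id INTEGER, flux REAL)')
conn.executemany('INSERT INTO example VALUES (?, ?)', [(1, 2.5), (2, 3.5)])

# Reading a table: column names come from the cursor description,
# row data from the result set; ATpy turns these into table columns.
cursor = conn.execute('SELECT * FROM example')
names = [d[0] for d in cursor.description]
rows = cursor.fetchall()
```

A custom query argument would simply replace the `SELECT * FROM example` statement here.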
It is possible to add columns to an empty or an existing table. Two methods exist for this. The first, add_column, allows users to add an existing array to a column. For example, the following can be used to add a column named time where the variable time_array is a NumPy array:
>>> t.add_column('time', time_array)
The add_column method also optionally takes metadata about the column, such as units, or a description. For example:
>>> t.add_column('time', time_array, unit='seconds')
indicates that the units of the column are seconds. It is also possible to convert the datatype of an array while adding it to a table by using the dtype argument. For example, the following stores the column from the above examples as 32-bit floating point values:
>>> t.add_column('time', time_array, unit='seconds', dtype=np.float32)
In some cases, it is desirable to add an empty column to a table, and populate it element by element. This can be done using the add_empty_column method. The only required arguments for this method are the name and the data type of the column:
>>> t.add_empty_column('id', np.int16)
If the column is the first one being added to an empty table, the shape argument should be used to specify the number of rows. This should either be an integer giving the number of rows, or a tuple in the case of vector columns (see Vector Columns for more details).
As well as one-dimensional columns, it is also possible to specify so-called vector columns, which are essentially two-dimensional arrays. Only FITS and VO tables support reading and writing these. The add_column method accepts two-dimensional arrays as input, and uses these to define vector columns. Empty vector columns can be created by using the add_empty_column method along with the shape argument to specify the full shape of the column. This should be a tuple of the form (n_rows, n_elements).
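Since table data is stored in a NumPy structured array (see Constructing a table), a vector column corresponds to a structured-array field with a sub-shape. A standalone NumPy sketch of the underlying layout (column names here are illustrative):

```python
import numpy as np

# One scalar column and one 3-element vector column, for 5 rows
dt = np.dtype([('id', np.int16), ('spectrum', np.float32, (3,))])
data = np.zeros(5, dtype=dt)

# Each row of the vector column holds a length-3 array
data['spectrum'][0] = [1.0, 2.0, 3.0]
```

The `(n_rows, n_elements)` tuple passed to add_empty_column describes exactly the shape of such a field.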
Writing data to files or databases is done through the write method. The arguments to this method are very similar to those of the read method. The main difference is that the write method can take an overwrite argument that specifies whether or not to overwrite existing files.
Comments and keywords can be added to a table using the add_comment() and add_keyword() methods:
>>> t.add_comment("This is a great table")
>>> t.add_keyword("meaning", 42)
ATpy requires the following:
The following packages are optional, but are required to read/write to certain formats:
The latest stable release of ATpy can be downloaded from PyPI. To install ATpy, use the standard installation procedure:
tar xvzf ATpy-X-X.X.tar.gz
cd ATpy-X.X.X/
python setup.py install
Advanced users wishing to use the latest development (“unstable”) version can check it out with:
git clone git://github.com/atpy/atpy.git
which can then be installed with:
cd atpy
python setup.py install
One of the new features introduced in ATpy 0.9.2 is the ability for users to write their own read/write functions and register them with ATpy. A read or write function needs to satisfy the following requirements:
The first argument should be a Table instance (in the case of a single table reader/writer) or a TableSet instance (in the case of a table set reader/writer)
The function can take any other arguments, with the exception of the keyword arguments verbose and type.
The function should not return anything, but rather should operate directly on the table or table set instance passed as the first argument
If the file format supports masking/null values, the function should take into account that there are two ways to mask values (see Masking and null values). The Table instance has a _masked attribute that specifies whether the user wants a Table with masked arrays, or with a null value. The function should take this into account. For example, in the built-in FITS reader, the table is populated with add_column in the following way:
if self._masked:
    self.add_column(name, data, unit=columns.units[i],
                    mask=data==columns.nulls[i])
else:
    self.add_column(name, data, unit=columns.units[i],
                    null=columns.nulls[i])
The reader/writer function can then fill the table by using the Table methods described in Full API for Table class (for a single table reader/writer) or Full API for TableSet class (for a table set reader/writer). In particular, a single table reader will likely contain calls to add_column, while a single table writer will likely contain references to the data attribute of Table.
Once a custom function is available, the user can register it using one of the four ATpy functions:
The API for these functions is of the form (ttype, function, override=True/False), where ttype is the code name for the format (like the built-in fits, vo, ipac, or sql types), function is the actual function to use, and override allows the user to override existing definitions (for example to provide an improved ipac reader).
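The registration pattern can be sketched in a few lines of plain Python (a hypothetical minimal registry, not ATpy's actual code; the fake reader below just records the filename rather than loading real columns):

```python
# Map from type code -> reader function
_readers = {}

def register_reader(ttype, function, override=False):
    # override=False protects built-in definitions from accidental clobbering
    if ttype in _readers and not override:
        raise Exception("Type %s is already defined" % ttype)
    _readers[ttype] = function

def read_any(table, ttype, *args, **kwargs):
    # Dispatch to the registered function; it mutates the table in place
    _readers[ttype](table, *args, **kwargs)

def fake_hdf5_read(table, filename):
    table['source'] = filename   # stand-in for real column loading

register_reader('hdf5', fake_hdf5_read)
t = {}
read_any(t, 'hdf5', 'mytable.hdf5')
```

The in-place mutation mirrors the requirement above that reader functions operate directly on the Table instance rather than returning a value.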
For example, if a function is defined for reading HDF5 tables, which we can call hdf5.read, then one would first need to register this function after importing atpy:
>>> import atpy
>>> atpy.register_reader('hdf5', hdf5.read)
This type can then be used when reading in a table:
>>> t = atpy.Table('mytable.hdf5', type='hdf5')
It is also possible to register extensions for a specific type using atpy.register_extensions. This function expects a table type and a list of file extensions to associate with it. For example, by setting:
>>> atpy.register_extensions('hdf5', ['hdf5', 'hdf'])
One can then read in an HDF5 table without specifying the type:
>>> t = atpy.Table('mytable.hdf5')
We encourage users to send us examples of reader/writer functions for various formats, and would be happy in future to include readers and writers for commonly used formats in ATpy.
Empty the table
Read in a table from a file/database.
Optional Keyword Arguments (independent of table type):
- verbose: [ True | False ]
- Whether to print out warnings when reading (default is True)
- type: [ string ]
- The read method attempts to automatically guess the file/database format based on the arguments supplied. The type can be overridden by setting this argument.
Write out a table to a file/database.
Optional Keyword Arguments (independent of table type):
- verbose: [ True | False ]
- Whether to print out warnings when writing (default is True)
- type: [ string ]
- The write method attempts to automatically guess the file/database format based on the arguments supplied. The type can be overridden by setting this argument.
Add a comment to the table
Required Argument:
- comment: [ string ]
- The comment to add to the table
Add a keyword/value pair to the table
Required Arguments:
- key: [ string ]
- The name of the keyword
- value: [ string | float | integer | bool ]
- The value of the keyword
Prints a description of the table
Add a column to the table
Required Arguments:
- name: [ string ]
- The name of the column to add
- data: [ numpy array ]
- The column data
Optional Keyword Arguments:
- unit: [ string ]
- The unit of the values in the column
- null: [ same type as data ]
- The values corresponding to ‘null’, if not NaN
- description: [ string ]
- A description of the content of the column
- format: [ string ]
- The format to use for ASCII printing
- dtype: [ numpy type ]
- Numpy type to convert the data to. This is the equivalent to the dtype= argument in numpy.array
- column_header: [ ColumnHeader ]
The metadata from an existing column to copy over. Metadata includes the unit, null value, description, format, and dtype. For example, to create a column ‘b’ with identical metadata to column ‘a’ in table ‘t’, use:
>>> t.add_column('b', column_header=t.columns['a'])
- before: [ string ]
- Column before which the new column should be inserted
- after: [ string ]
- Column after which the new column should be inserted
- position: [ integer ]
- Position at which the new column should be inserted (0 = first column)
- mask: [ numpy array ]
- An array of booleans, with the same dimensions as the data, indicating whether or not to mask values.
- fill: [ value ]
- If masked arrays are used, this value is used as the fill value in the numpy masked array.
Add an empty column to the table. This only works if there are already existing columns in the table.
Required Arguments:
- name: [ string ]
- The name of the column to add
- dtype: [ numpy type ]
- Numpy type of the column. This is the equivalent to the dtype= argument in numpy.array
Optional Keyword Arguments:
- unit: [ string ]
- The unit of the values in the column
- null: [ same type as data ]
- The values corresponding to ‘null’, if not NaN
- description: [ string ]
- A description of the content of the column
- format: [ string ]
- The format to use for ASCII printing
- column_header: [ ColumnHeader ]
The metadata from an existing column to copy over. Metadata includes the unit, null value, description, format, and dtype. For example, to create a column ‘b’ with identical metadata to column ‘a’ in table ‘t’, use:
>>> t.add_column('b', column_header=t.columns['a'])
- shape: [ tuple ]
- Tuple describing the shape of the empty column that is to be added. If a one element tuple is specified, it is the number of rows. If a two element tuple is specified, the first is the number of rows, and the second is the width of the column.
- before: [ string ]
- Column before which the new column should be inserted
- after: [ string ]
- Column after which the new column should be inserted
- position: [ integer ]
- Position at which the new column should be inserted (0 = first column)
Remove several columns from the table
Required Argument:
- remove_names: [ list of strings ]
- A list containing the names of the columns to remove
Keep only specific columns in the table (remove the others)
Required Argument:
- keep_names: [ list of strings ]
- A list containing the names of the columns to keep. All other columns will be removed.
Rename a column in the table
Required Arguments:
- old_name: [ string ]
- The current name of the column.
- new_name: [ string ]
- The new name for the column
Set the name of the table column that should be used as a unique identifier for the record. This is the same as primary keys in SQL databases. A primary column cannot contain NULLs and must contain only unique quantities.
Required Arguments:
- key: [ string ]
- The column to use as a primary key
Sort the table according to one or more keys. This operates on the existing table (and does not return a new table).
Required arguments:
- keys: [ string | list of strings ]
- The key(s) to order by
Returns a single row
Required arguments:
- row_number: [ integer ]
- The row number (the first row is 0)
Optional Keyword Arguments:
- python_types: [ True | False ]
- Whether to return the row elements with python (True) or numpy (False) types.
Select specific rows from the table and return a new table instance
Required Argument:
- row_ids: [ list | np.int array ]
- A python list or numpy array specifying which rows to select, and in what order.
Returns:
A new table instance, containing only the rows selected
Select matching rows from the table and return a new table instance
Required Argument:
- mask: [ np.bool array ]
- A boolean array with the same length as the table.
Returns:
A new table instance, containing only the rows selected
Note
An IPAC table is an ASCII file that contains a single table. The format can contain meta-data consisting of keyword values and comments (analogous to FITS files), and the column headers are separated by pipe (|) symbols that indicate the positions of the columns.
IPAC tables are natively supported in ATpy (no additional module is required). Reading IPAC tables is straightforward:
>>> t = atpy.Table('table.tbl')
and writing a table out in IPAC format is equally easy:
>>> t.write('table.tbl')
IPAC tables can have three different definitions with regard to the alignment of the columns with the pipe symbols in the header. The definition to use is controlled by the definition argument. The definitions are:
The default is definition=3.
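Under definition 3, the pipe positions in the header line delimit the columns exactly, so parsing the header reduces to slicing between pipes. A purely illustrative sketch (this is not ATpy's actual parser, and the header line is made up):

```python
# A definition-3 IPAC header: no data characters sit below the pipes
header = "|   ra    |   dec   |  flux  |"

# Column boundaries are the pipe positions
pipes = [i for i, c in enumerate(header) if c == '|']

# Column names are the stripped text between consecutive pipes
names = [header[a + 1:b].strip() for a, b in zip(pipes[:-1], pipes[1:])]
```

Definitions 1 and 2 differ only in which side a character directly below a pipe is assigned to, which matters when data is packed tightly against the boundaries.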
Note
As for all file formats, the verbose argument can be specified to control whether warning messages are shown when reading (the default is verbose=True), and the overwrite argument can be used when writing to overwrite a file (the default is overwrite=False).
Note
The following functions should not be called directly - the arguments should be passed to Table()/Table.read() and Table.write() respectively.
Read a table from an IPAC file
Required Arguments:
- filename: [ string ]
- The IPAC file to read the table from
Optional Keyword Arguments:
definition: [ 1 | 2 | 3 ]
The definition to use to read IPAC tables:
- 1: any character below a pipe symbol belongs to the column on the left, and any characters below the first pipe symbol belong to the first column.
- 2: any character below a pipe symbol belongs to the column on the right.
- 3: no characters should be present below the pipe symbols (default).
smart_typing: [ True | False ]
Whether to try and save memory by using the smallest integer type that can contain a column. For example, a column containing only values between 0 and 255 can be stored as an unsigned 8-bit integer column. The default is false, so that all integer columns are stored as 64-bit integers.
Write the table to an IPAC file
Required Arguments:
- filename: [ string ]
- The IPAC file to write the table to