Commit dd534f055bbb1e7dd925af3cf1ec2ea76cbc8e84

Authored by Nathanael Jourdane
1 parent ba941570
Exists in master

Improve documentation

converter/usage.md 0 → 100644
@@ -0,0 +1,143 @@
  1 +# NetCDF to CDF converter for AMDADB
  2 +
  3 +- File: [./converter/nc2cdf.py](./converter/nc2cdf.py)
  4 +- Python interpreter: 2.7 or 3.6
  5 +
  6 +## Converting a NetCDF file
  7 +
  8 +### Without output path
  9 +
  10 +Convert a NetCDF file and save the CDF file in a temp directory, then display its path:
  11 +
  12 +**CLI usage**
  13 +
  14 +```bash
  15 +./converter/nc2cdf.py <input_netcdf_file>
  16 +```
  17 +
  18 +**Python usage**
  19 +
  20 +```python
  21 +from nc2cdf import NetCdf
  22 +netcdf = NetCdf('<input_netcdf_file>')
  23 +netcdf.get_cdf()
  24 +print('CDF path: ' + netcdf.get_cdf_path())
  25 +```
  26 +
  27 +- `<input_netcdf_file>`: the NetCDF file you want to convert to CDF (can be a gzip archive containing the NetCDF file).
  28 +
  29 +> Working example:
  30 +>
  31 +> ```bash
  32 +> ./converter/nc2cdf.py examples/skr151150000.nc.gz
  33 +> File stored in "/tmp/skr151150000.cdf".
  34 +> ```
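When no output path is given, the converter picks a default location in the temp directory, as the example above shows. A plain-Python sketch of that naming rule; `default_cdf_path` is a hypothetical helper, not a function exported by `nc2cdf.py`:

```python
import os
import tempfile

def default_cdf_path(input_path):
    # Hypothetical helper: strip the .gz and .nc suffixes, append .cdf,
    # and place the result in the system temp directory, matching the
    # example above (examples/skr151150000.nc.gz -> /tmp/skr151150000.cdf).
    name = os.path.basename(input_path)
    for suffix in ('.gz', '.nc'):
        if name.endswith(suffix):
            name = name[:-len(suffix)]
    return os.path.join(tempfile.gettempdir(), name + '.cdf')

print(default_cdf_path('examples/skr151150000.nc.gz'))
```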
  35 +
  36 +### With output path
  37 +
  38 +Convert a NetCDF file and save the CDF file in the specified path:
  39 +
  40 +**CLI usage**
  41 +
  42 +```bash
  43 +./nc2cdf.py <input_netcdf_file> <output_cdf_file>
  44 +```
  45 +
  46 +**Python usage**
  47 +
  48 +```python
  49 +from nc2cdf import NetCdf
  50 +netcdf = NetCdf('<input_netcdf_file>')
  51 +netcdf.get_cdf('<output_cdf_file>')
  52 +```
  53 +
  54 +- `<input_netcdf_file>`: the NetCDF file you want to convert (can be a gzip archive containing the NetCDF file);
  55 +- `<output_cdf_file>`: the path where you want to save the CDF file.
  56 +
  57 +> Working example:
  58 +>
  59 +> ```bash
  60 +> ./converter/nc2cdf.py examples/skr151150000.nc.gz ./skr151150000.cdf
  61 +> ```
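As noted above, the input may be a gzip archive; a minimal sketch of the transparent extraction step (an assumption about what `nc2cdf.py` does internally, using only the standard library):

```python
import gzip
import os
import shutil
import tempfile

def maybe_extract(path):
    # If the input ends in .gz, decompress it into the temp directory
    # and return the extracted path; otherwise return the path unchanged.
    if not path.endswith('.gz'):
        return path
    extracted = os.path.join(tempfile.gettempdir(), os.path.basename(path)[:-3])
    with gzip.open(path, 'rb') as src, open(extracted, 'wb') as dst:
        shutil.copyfileobj(src, dst)
    return extracted
```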
  62 +
  63 +## Describing a NetCDF file
  64 +
  65 +Display information about a NetCDF file, such as global attributes and variable information:
  66 +
  67 +**CLI usage**
  68 +
  69 +```bash
  70 +./nc2cdf.py -i <input_netcdf_file>
  71 +```
  72 +
  73 +**Python usage**
  74 +
  75 +```python
  76 +from nc2cdf import NetCdf
  77 +netcdf = NetCdf('<input_netcdf_file>')
  78 +netcdf.describe()
  79 +```
  80 +
  81 +- `<input_netcdf_file>`: the NetCDF file you want to describe (can be a gzip archive containing the NetCDF file).
  82 +
  83 +> Working example:
  84 +>
  85 +> ```bash
  86 +> ./converter/nc2cdf.py -i examples/skr151150000.nc.gz
  87 +> == Time ==
  88 +> - numpy type: |S1 ^ 2
  89 +> - dimension(s): Time, TimeLength
  90 +> - size: 481x17 = 8177
  91 +> == Received ==
  92 +> - numpy type: float32 ^ 2
  93 +> - dimension(s): Time, Data
  94 +> - size: 481x4 = 1924
  95 +> - Attributes:
  96 +> - Order: RH 100-400kHz; LH 100-400kHz; RH 10-1000kHz; RH 10-1000kHz
  97 +> - Units: W/m^2
  98 +> == Emitted ==
  99 +> - numpy type: float32 ^ 2
  100 +> - dimension(s): Time, Data
  101 +> - size: 481x4 = 1924
  102 +> - Attributes:
  103 +> - Order: RH 100-400kHz; LH 100-400kHz; RH 10-1000kHz; RH 10-1000kHz
  104 +> - Units: W/sr
  105 +> == RH ==
  106 +> - numpy type: float32 ^ 2
  107 +> - dimension(s): Time, Spec
  108 +> - size: 481x48 = 23088
  109 +> - Attributes:
  110 +> - Units: W/m^2/Hz
  111 +> == VR ==
  112 +> - numpy type: float32 ^ 2
  113 +> - dimension(s): Time, Spec
  114 +> - size: 481x48 = 23088
  115 +> - Attributes:
  116 +> - desc: circular polarization degree; valid range: -1.1 - -0.2
  117 +> == LH ==
  118 +> - numpy type: float32 ^ 2
  119 +> - dimension(s): Time, Spec
  120 +> - size: 481x48 = 23088
  121 +> - Attributes:
  122 +> - Units: W/m^2/Hz
  123 +> == VL ==
  124 +> - numpy type: float32 ^ 2
  125 +> - dimension(s): Time, Spec
  126 +> - size: 481x48 = 23088
  127 +> - Attributes:
  128 +> - desc: circular polarization degree; valid range: -1.1 - -0.2
  129 +> == StartTime ==
  130 +> - numpy type: |S1 ^ 1
  131 +> - dimension(s): TimeLength
  132 +> - size: 17 = 17
  133 +> - values: '2015115000000000', ...
  134 +> == StopTime ==
  135 +> - numpy type: |S1 ^ 1
  136 +> - dimension(s): TimeLength
  137 +> - size: 17 = 17
  138 +> - values: '2015115235959000', ...
  139 +> == Global attributes ==
  140 +> - Source: http://www.lesia.obspm.fr/kronos/
  141 +> - note: Only fluxes and powers with |V| > 0.2 are taken into account
  142 +> - Created: Wed Jul 22 11:09:30 2015
  143 +> ```
install/dependencies.md 0 → 100644
@@ -0,0 +1,110 @@
  1 +# Project dependencies
  2 +
  3 +## Description
  4 +
  5 +This project requires the following dependencies.
  6 +
  7 +### NetCDF4
  8 +
  9 +- [NetCDF](https://www.unidata.ucar.edu/software/netcdf/) is a C library to read and edit NetCDF files;
  10 +- [NetCDF4](https://github.com/Unidata/netcdf4-python) is a Python wrapper for NetCDF, which requires the NetCDF
  11 +library, used here to read NetCDF files.
  12 +
  13 +-> Documentation is available [here](http://unidata.github.io/netcdf4-python/).
  14 +
  15 +### pycdf
  16 +
  17 +- [SpacePy](http://pythonhosted.org/SpacePy/index.html) is a Python package for space sciences;
  18 +- [pycdf](http://pythonhosted.org/SpacePy/pycdf.html) is a SpacePy sub-package used to read and write CDF files.
  19 +
  20 +-> Documentation is available [here](http://pythonhosted.org/SpacePy/pycdf.html).
  21 +
  22 +### Sphinx
  23 +
  24 +Sphinx is a documentation generator for Python projects.
  25 +
  26 +-> Documentation is available [here](http://www.sphinx-doc.org/en/stable/).
  27 +
  28 +## Installing environment and dependencies
  29 +
  30 +Here we will install dependencies in Python environments.
  31 +
  32 +### Case 1: If you have NetCDF installed on your machine
  33 +
  34 +You can use pip and virtualenv:
  35 +
  36 +```bash
  37 +pip install virtualenv # installing virtualenv
  38 +virtualenv -p python3 nc2cdf # creating virtualenv
  39 +source nc2cdf/bin/activate # activating virtualenv (or ". nc2cdf/bin/activate.fish" on Fish shells)
  40 +pip install -r install/pip_req_nc2cdf.txt # installing dependencies
  41 +```
  42 +
  43 +Then, each time you need to run one of the scripts in this project, activate the environment first:
  44 +
  45 +```bash
  46 +source nc2cdf/bin/activate
  47 +```
  48 +
  49 +And you can deactivate it with:
  50 +
  51 +```bash
  52 +deactivate
  53 +```
  54 +
  55 +### Case 2: If you don't have NetCDF installed on your machine
  56 +
  57 +The easiest way is to use [Anaconda](https://docs.continuum.io/), a tool to install compiled Python dependencies
  58 + in environments.
  59 +
  60 +First, [install Anaconda3](https://docs.continuum.io/anaconda/install).
  61 +
  62 +The installer may offer to add Anaconda to your system path, but this would make the Anaconda Python interpreter
the default interpreter, which might be a problem. I recommend adding an alias which both adds Anaconda to your path
and sources the environment.
  65 +Edit your system startup file (usually `.bashrc`):
  66 +
  67 +```bash
  68 +alias conda3="export PATH=$HOME/.anaconda3/bin:$PATH && source activate"
  69 +```
  70 +
  71 +or on Fish shell:
  72 +
  73 +```bash
  74 +alias conda3="set PATH $HOME/.anaconda3/bin/ $PATH; and source $HOME/.anaconda3/etc/fish/conf.d/conda.fish; and activate"
  75 +```
  76 +
  77 +> **Note:** this procedure will also allow you to manage several Anaconda versions (i.e. 2 and 3) by using several
  78 +> aliases.
  79 +
  80 +Now, create the environment (required only the first time), assuming the Anaconda installation path is
  81 +`$HOME/.anaconda3/`:
  82 +
  83 +```bash
  84 +export PATH=$HOME/.anaconda3/bin:$PATH # or "set PATH $HOME/.anaconda3/bin/ $PATH" on Fish shells
  85 +conda env create -f install/conda_env_nc2cdf.yml
  86 +```
  87 +
  88 +Then, each time you need to run one of the scripts in this project, activate the environment first:
  89 +
  90 +```bash
  91 +conda3 nc2cdf
  92 +```
  93 +
  94 +And you can deactivate it with:
  95 +
  96 +```bash
  97 +source deactivate
  98 +```
  99 +
  100 +### That's all!
  101 +
  102 +You can now run all scripts in this project:
  103 +
  104 +```bash
  105 +./create_granules.py
  106 +```
  107 +
  108 +## Build the documentation
  109 +
  110 +To be completed
0 \ No newline at end of file 111 \ No newline at end of file
1 # CDF tools 1 # CDF tools
2 2
3 -## NetCDF to CDF converter for AMDADB 3 +This project contains a set of files used to publish EPN-TAP compliant planetary data for AMDADB.
4 4
5 -- File: [nc2cdf.py](./nc2cdf.py)  
6 -- Python interpreter: 3.6 5 +This data can be accessed through:
  6 +- the [VESPA client](http://vespa.obspm.fr/planetary/data/epn/query/all/);
  7 +- ADQL queries from the [DaCHS server](http://amda-epntap.irap.omp.eu/__system__/adql/query/form);
  8 +- or all clients implementing the EPN-TAP protocol, such as [AMDADB](http://amda.cdpp.eu/),
  9 +[CASSIS](http://cassis.irap.omp.eu/) or [3Dview](http://3dview.cdpp.eu/).
7 10
8 -### CLI usage 11 +## Content
9 12
10 -#### Converting a Net-CDF file: 13 +Projects:
11 14
12 -Convert the NetCDF file, save it in a temp directory, then display its path: 15 +- [converter](./converter): the [NetCDF-to-CDF converter](./converter/usage.md);
  16 +- [dataset_getter](./dataset_getter): download SPASE datasets and generate granule files;
  17 +- [DaCHS](./DaCHS): files required to publish granules on the DaCHS server;
13 18
14 -```bash  
15 -./nc2cdf.py path/to/input_file.nc.gz  
16 -``` 19 +Other directories:
17 20
18 -Convert a Net-CDF file and save it in the specified path. 21 +- [doc](./doc): documentation site for this project;
  22 +- [install](./install): installation files (to create the Python environment, build the documentation, etc.);
  23 +- [examples](./examples): some example files (used to learn or test some scripts);
  24 +- [specs](./specs): data model specifications (for `SPASE` or `CDF ISTP`);
  25 +- [readme.md](readme.md): this file.
19 26
20 -```bash  
21 -./nc2cdf.py path/to/input_file.nc.gz path/to/output_file.cdf  
22 -``` 27 +After executing some scripts, the local folder may contain additional files:
23 28
24 -**Note:** If the specified input file is a gzip archive, it will be automatically extracted in a temp directory before the conversion. 29 +- `nc2cdf`: the Python environment, generated by `virtualenv`;
  30 +- `doc_source`: source files for the documentation, generated by `Sphinx`;
  31 +- `DATA/CDPP`: SPASE dataset files, downloaded (or generated in the case of `DATA/CDPP/Granules`) by
  32 +`create_granules.py`;
  33 +- `log`: log repository, generated by several scripts;
  34 +- `DaCHS/amdadb_db.sql`: the PSQL script which inserts EPN-TAP granules on the DaCHS server, generated by
  35 +`./DaCHS/build_BDD.py`.
25 36
26 -#### Describing a NetCDf file:  
27 -  
28 -```bash  
29 -./nc2cdf.py -i path/to/file.nc.gz  
30 -```  
31 -  
32 -This display information about a Net-CDF file (such as global attributes and variables information).  
33 -  
34 -### Python usage  
35 -  
36 -```python  
37 -import nc2cdf  
38 -  
39 -netcdf = NetCdf('path/to/input_file.nc.gz')  
40 -  
41 -netcdf.describe()  
42 -  
43 -netcdf.get_cdf()  
44 -print('CDF path: ' + netcdf.get_cdf_path())  
45 -  
46 -netcdf.get_cdf('path/to/output_file.cdf')  
47 -```  
48 -  
49 -## Dependencies  
50 -  
51 -- NetCDF4  
52 -  
53 -[NetCDF](https://www.unidata.ucar.edu/software/netcdf/) is C library to read and edit NetCDF files.  
54 -  
55 -[NetCDF4](https://github.com/Unidata/netcdf4-python) is a Python wrapper for NetCDF, which requires the NetCDF library, used here to read NetCDF files.  
56 -  
57 -Documentation is available [here](http://unidata.github.io/netcdf4-python/).  
58 -  
59 -- pycdf  
60 -  
61 -[SpacePy](http://pythonhosted.org/SpacePy/index.html) is a python package for space sciences, used here to write CDF files.  
62 -  
63 -Documentation of the package spacepy.pycdf is available [here](http://pythonhosted.org/SpacePy/pycdf.htm).  
64 -  
65 -### Installing the Python environment and dependencies  
66 -  
67 -We will install dependencies in Python environments.  
68 -  
69 -#### Case 1: If you have NetCDF installed on your machine  
70 -  
71 -You can use pip and virtualenv:  
72 -  
73 -```bash  
74 -pip install virtualenv  
75 -virtualenv -p python3 nc2cdf  
76 -source nc2cdf/bin/activate # Or ". nc2cdf/bin/activate.fish" on Fish terms  
77 -pip install -r pip_req_nc2cdf.txt  
78 -```  
79 -  
80 -#### Case 2: If you don't have NetCDF installed on your machine  
81 -  
82 -The easier way is to use [Anaconda](https://docs.continuum.io/), which is a tool to install compiled Python dependencies in environments.  
83 -  
84 -First, [install Anaconda3](https://docs.continuum.io/anaconda/install).  
85 -  
86 -Then edit your system startup file:  
87 -  
88 -I recommend to add an alias which set the Python path. In this way the Anaconda Python will not be used by default and you can easily deal with multiple Anaconda versions.  
89 -  
90 -Add at the end of your `~/.bashrc`:  
91 -  
92 -```bash  
93 -alias conda3="set PATH $HOME/.anaconda3/bin/ $PATH"  
94 -```  
95 -  
96 -Or on Fish terms (`~/.config/omf/init.fish`):  
97 -  
98 -```bash  
99 -alias conda3="set PATH $HOME/.anaconda3/bin/ $PATH; and source $HOME/.anaconda3/etc/fish/conf.d/conda.fish  
100 -```  
101 -  
102 -Now create the environment:  
103 -  
104 -```bash  
105 -conda3  
106 -conda create -f conda_env_nc2cdf.yml  
107 -```  
108 -  
109 -To finish, activate the Conda environment:  
110 -  
111 -```bash  
112 -source activate nc2cdf # or only "activate nc2cdf" on Fish terms  
113 -```  
114 -  
115 -You can now use the converter.  
116 -  
117 -### Licence 37 +## Licence
118 38
119 - License: [GPLv3](https://www.gnu.org/licenses/gpl-3.0.html); 39 - License: [GPLv3](https://www.gnu.org/licenses/gpl-3.0.html);
120 - Credits: CNRS/IRAP; 40 - Credits: CNRS/IRAP;
specs/CDF_spec.md
@@ -10,7 +10,8 @@ See [the ISTP guidelines for global attributes](https://spdf.gsfc.nasa.gov/istp_
10 10
11 ### Acknowledgement 11 ### Acknowledgement
12 12
13 -Value of node `NumericalData/ResourceHeader/Acknowledgement` from SPASE file *NumericalData* corresponding to the granule. 13 +Value of node `NumericalData/ResourceHeader/Acknowledgement` from SPASE file *NumericalData* corresponding to the
  14 +granule.
14 15
15 > **Ex:** 16 > **Ex:**
16 > 17 >
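Many of the attributes below are defined as "the value of node X from a SPASE file". A generic lookup can be sketched with the standard library; the sample document and the `node_value` helper are illustrative only (real SPASE files declare an XML namespace, omitted here for brevity):

```python
import xml.etree.ElementTree as ET

SAMPLE = """<Spase>
  <NumericalData>
    <ResourceHeader>
      <Acknowledgement>CDPP</Acknowledgement>
    </ResourceHeader>
  </NumericalData>
</Spase>"""

def node_value(spase_xml, path):
    # Return the text of the node at the given slash-separated path,
    # or None when the node is absent (callers then apply a fallback).
    node = ET.fromstring(spase_xml).find(path)
    return node.text if node is not None else None

print(node_value(SAMPLE, 'NumericalData/ResourceHeader/Acknowledgement'))  # -> CDPP
```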
@@ -39,7 +40,8 @@ Instrument name
39 ### Discipline 40 ### Discipline
40 41
41 - If target name is `Sun` or `Earth` : ***Space Physics*** 42 - If target name is `Sun` or `Earth` : ***Space Physics***
42 -- otherwise, look for node value of `NumericalData/ObservedRegion` from the *NumericalData* SPASE file corresponding to the granule: 43 +- otherwise, look for node value of `NumericalData/ObservedRegion` from the *NumericalData* SPASE file corresponding
  44 +to the granule:
43 - if it contains `Magnetosphere`: ***Space Physics>Magnetospheric Science***; 45 - if it contains `Magnetosphere`: ***Space Physics>Magnetospheric Science***;
44 - if it contains `NearSurface`: ***Space Physics>Ionospheric Science***; 46 - if it contains `NearSurface`: ***Space Physics>Ionospheric Science***;
45 - if it contains `Heliosphere`: ***Space Physics>Interplanetary Studies***. 47 - if it contains `Heliosphere`: ***Space Physics>Interplanetary Studies***.
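The branching above can be sketched as a small helper; `discipline` is an illustrative name, not a function from this project, and the empty-string fallback for unlisted regions is an assumption:

```python
def discipline(target_name, observed_region):
    # Map target name / ObservedRegion to the Discipline value per the rules above.
    if target_name in ('Sun', 'Earth'):
        return 'Space Physics'
    if 'Magnetosphere' in observed_region:
        return 'Space Physics>Magnetospheric Science'
    if 'NearSurface' in observed_region:
        return 'Space Physics>Ionospheric Science'
    if 'Heliosphere' in observed_region:
        return 'Space Physics>Interplanetary Studies'
    return ''  # assumption: unspecified cases are left empty

print(discipline('Jupiter', 'Jupiter.Magnetosphere'))  # -> Space Physics>Magnetospheric Science
```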
@@ -50,7 +52,8 @@ Instrument name
50 52
51 ### Generation_date 53 ### Generation_date
52 54
53 -Value of node `NumericalData/ResourceHeader/ReleaseDate` from the *NumericalData* SPASE file corresponding to the granule, formated as *yyyymmdd*. 55 +Value of node `NumericalData/ResourceHeader/ReleaseDate` from the *NumericalData* SPASE file corresponding to the
  56 +granule, formatted as *yyyymmdd*.
54 57
55 > **Ex:** 58 > **Ex:**
56 > 59 >
@@ -70,7 +73,8 @@ Value of node `Instrument/InstrumentType` from the *Instrument* SPASE file corre
70 73
71 ### LINK_TEXT 74 ### LINK_TEXT
72 75
73 -Value of node `NumericalData/ResourceHeader/AlternateName` from the *NumericalData* SPASE file corresponding to the granule **+** ` data are available at `. 76 +Value of node `NumericalData/ResourceHeader/AlternateName` from the *NumericalData* SPASE file corresponding to the
  77 +granule **+** ` data are available at `.
74 78
75 > **Ex:** 79 > **Ex:**
76 > 80 >
@@ -100,15 +104,18 @@ Dataset name
100 104
101 ### Logical_source_description 105 ### Logical_source_description
102 106
103 -Value of node `NumericalData/ResourceHeader/Description` from the *NumericalData* SPASE file corresponding to the granule. 107 +Value of node `NumericalData/ResourceHeader/Description` from the *NumericalData* SPASE file corresponding to the
  108 +granule.
104 109
105 > **Ex:** 110 > **Ex:**
106 > 111 >
107 -> This instrument (CIS: Cluster Ion Spectrometry) is capable of obtaining full 3D ion distributions with high time resolution (in one spacecraft spin) and mass-per-charge resolution. The experiment consists of [...] 112 +> This instrument (CIS: Cluster Ion Spectrometry) is capable of obtaining full 3D ion distributions with high time
  113 +> resolution (in one spacecraft spin) and mass-per-charge resolution. The experiment consists of [...]
108 114
109 ### Mission_group 115 ### Mission_group
110 116
111 -- Last element of the value of node `Observatory/ObservatoryGroupID` from the *Observatory* SPASE file corresponding to the granule. 117 +- Last element of the value of node `Observatory/ObservatoryGroupID` from the *Observatory* SPASE file corresponding to
  118 +the granule.
112 - if the node doesn't exist: value of node `Observatory/ResourceName`. 119 - if the node doesn't exist: value of node `Observatory/ResourceName`.
113 120
114 > **Ex:** 121 > **Ex:**
@@ -137,7 +144,8 @@ Value of node `Person/OrganisationName` from the *Person* SPASE file correspondi
137 144
138 ### PI_name 145 ### PI_name
139 146
140 -Value of node `NumericalData/Contact/PersonID` where `Role` node value is `PrincipalInvestigator`, from the *NumericalData* SPASE file corresponding to the granule. 147 +Value of node `NumericalData/Contact/PersonID` where `Role` node value is `PrincipalInvestigator`, from the
  148 +*NumericalData* SPASE file corresponding to the granule.
141 149
142 > **Ex:** 150 > **Ex:**
143 > 151 >
@@ -159,12 +167,15 @@ A list with:
159 167
160 System Use Policy 168 System Use Policy
161 169
162 - Thank you for acknowledging the use of AMDA in publications with wording like "Data analysis was performed with the AMDA science analysis system provided by the Centre de Données de la Physique des Plasmas (CDPP) supported by CNRS, CNES, Observatoire de Paris and Université Paul Sabatier, Toulouse." 170 + Thank you for acknowledging the use of AMDA in publications with wording like "Data analysis was performed with the
  171 + AMDA science analysis system provided by the Centre de Données de la Physique des Plasmas (CDPP) supported by CNRS,
  172 + CNES, Observatoire de Paris and Université Paul Sabatier, Toulouse."
163 173
164 Data use policy from originating <NAME> data center: <URL> 174 Data use policy from originating <NAME> data center: <URL>
165 175
166 where : 176 where :
167 -- <NAME> is the value of node `NumericalData/ProviderName` from the SPASE file *NumericalData* corresponding to the granule, or CDPP if the value doesn't exist. 177 +- <NAME> is the value of node `NumericalData/ProviderName` from the SPASE file *NumericalData* corresponding to the
  178 +granule, or CDPP if the value doesn't exist.
168 - and <URL> is kept from this dictionary (according to <NAME>). 179 - and <URL> is kept from this dictionary (according to <NAME>).
169 180
170 - CDPP: http://amda.irap.omp.eu/help/licenceAction.do.html 181 - CDPP: http://amda.irap.omp.eu/help/licenceAction.do.html
@@ -184,9 +195,10 @@ where :
184 > **Ex:** 195 > **Ex:**
185 > 196 >
186 > System Use Policy 197 > System Use Policy
187 ->  
188 -> Thank you for acknowledging the use of AMDA in publications with wording like "Data analysis was performed with the AMDA science analysis system provided by the Centre de Données de la Physique des Plasmas (CDPP) supported by CNRS, CNES, Observatoire de Paris and Université Paul Sabatier, Toulouse."  
189 -> 198 +>
  199 +> Thank you for acknowledging the use of AMDA in publications with wording like "Data analysis was performed with
  200 +> the AMDA science analysis system provided by the Centre de Données de la Physique des Plasmas (CDPP) supported by
  201 +> CNRS, CNES, Observatoire de Paris and Université Paul Sabatier, Toulouse."
190 > Data use policy from originating CDPP data center: http://amda.irap.omp.eu/help/licenceAction.do.html 202 > Data use policy from originating CDPP data center: http://amda.irap.omp.eu/help/licenceAction.do.html
191 203
192 ### Software_version 204 ### Software_version
@@ -203,11 +215,13 @@ The mission name
203 215
204 ### TEXT 216 ### TEXT
205 217
206 -The value of node `NumericalData/ResourceHeader/InformationURL/Name` **+** `, ` **+** the value of node `NumericalData/ResourceHeader/InformationURL/URL`, from the SPASE file *NumericalData* corresponding to the granule. 218 +The value of node `NumericalData/ResourceHeader/InformationURL/Name` **+** `, ` **+** the value of node
  219 +`NumericalData/ResourceHeader/InformationURL/URL`, from the SPASE file *NumericalData* corresponding to the granule.
207 220
208 > **Ex:** 221 > **Ex:**
209 > 222 >
210 -> NSSDC Master Catalog listing for Cluster II Rumba Cluster Ion Spectrometry (CIS), http://nssdc.gsfc.nasa.gov/nmc/experimentDisplay.do?id=2000-045A-02 223 +> NSSDC Master Catalog listing for Cluster II Rumba Cluster Ion Spectrometry (CIS),
  224 +> http://nssdc.gsfc.nasa.gov/nmc/experimentDisplay.do?id=2000-045A-02
211 225
212 ### Time_resolution 226 ### Time_resolution
213 227
@@ -219,7 +233,8 @@ The value of node `NumericalData/ResourceHeader/InformationURL/Name` **+** `, `
219 233
220 ### TITLE 234 ### TITLE
221 235
222 -The value of node `NumericalData/ResourceHeader/AlternateName`, from the SPASE file *NumericalData* corresponding to the granule. 236 +The value of node `NumericalData/ResourceHeader/AlternateName`, from the SPASE file *NumericalData* corresponding to the
  237 +granule.
223 238
224 > **Ex:** 239 > **Ex:**
225 > 240 >
@@ -251,7 +266,8 @@ The value of node `NumericalData/ResourceHeader/AlternateName`, from the SPASE f
251 266
252 ### PDS_Observation_type 267 ### PDS_Observation_type
253 268
254 -Look for the child node in `NumericalData/Parameter` node, from the SPASE file *NumericalData* corresponding to the granule. 269 +Look for the child node in `NumericalData/Parameter` node, from the SPASE file *NumericalData* corresponding to the
  270 +granule.
255 271
256 It is one of: `Field`, `Particules`, `Wave`, `Mixed`, `Support`. 272 It is one of: `Field`, `Particules`, `Wave`, `Mixed`, `Support`.
257 273
@@ -276,7 +292,8 @@ Add other epn-core parameters, with `VESPA_` prefix, only if:
276 292
277 See [the ISTP guidelines for variables](https://spdf.gsfc.nasa.gov/istp_guide/variables.html). 293 See [the ISTP guidelines for variables](https://spdf.gsfc.nasa.gov/istp_guide/variables.html).
278 294
279 -To create new variables with [spacepy.pycdf](http://pythonhosted.org/SpacePy/pycdf.html), use [cdf.new()](http://pythonhosted.org/SpacePy/autosummary/spacepy.pycdf.CDF.html#spacepy.pycdf.CDF.new) and not `attr[]`. 295 +To create new variables with [spacepy.pycdf](http://pythonhosted.org/SpacePy/pycdf.html), use
  296 +[cdf.new()](http://pythonhosted.org/SpacePy/autosummary/spacepy.pycdf.CDF.html#spacepy.pycdf.CDF.new) and not `attr[]`.
280 297
281 To name variables, use the `NumericalData/Parameter/ParameterKey` parameter from the SPASE file. 298 To name variables, use the `NumericalData/Parameter/ParameterKey` parameter from the SPASE file.
282 299
@@ -304,7 +321,8 @@ type : CDF_CHAR
304 321
305 ### LABL_PTR_1 322 ### LABL_PTR_1
306 323
307 -Note: Only if the node value of `NumericalData/Parameter/RenderingHints` is `time series` and the node `NumericalData/Parameter/Structure` exist. 324 +Note: Only if the node value of `NumericalData/Parameter/RenderingHints` is `time series` and the node
   325 +`NumericalData/Parameter/Structure` exists.
308 326
309 Note 2: `LABL_PTR_1` and `DEPEND_1` cannot both be present. 327
310 328