diff --git a/latex/AlborexData_ESSD.tex b/latex/AlborexData_ESSD.tex
index 05e1aa8d2f41909a56225f77e9ef329e8cafea86..4436bdd286baab4d15ba41cbb6de8a2dec9905fa 100644
--- a/latex/AlborexData_ESSD.tex
+++ b/latex/AlborexData_ESSD.tex
@@ -121,12 +121,10 @@ The AlborEx dataset can be used for DA experiments, for example assimilating the
 
 Finally, other observing and modeling programs in the Mediterranean Sea can also benefit from the present dataset, for instance the Coherent Lagrangian Pathways from the Surface Ocean to Interior (CALYPSO) in the Southwest Mediterranean Sea \citep{JOHNSTON2018}. Similarly to AlborEx, CALYPSO strives to study a strong ocean front and the vertical exchanges taking place in the area of interest. For details on the mission objectives, see \url{https://www.onr.navy.mil/Science-Technology/Departments/Code-32/All-Programs/Atmosphere-Research-322/Physical-Oceanography/CALYPSO-DRI}, last accessed December 17, 2018.
 
-\subsection{Data processing\label{sec:processing}}
+\subsection{Processing levels\label{sec:processing}}
 
 For each of the platforms described in Sec.~\ref{sec:mission}, different processing steps are performed to turn the raw data into quality-controlled, standardised data directly usable by scientists and experts. The specific conventions for data managed by SOCIB are explained below.
 
-\subsubsection{Processing levels}
-
 All the data provided by SOCIB are available in different so-called processing levels, ranging from 0 (raw data) to 2 (gridded data). The files are organized by \textit{deployments}, where a deployment is defined as an event initiated when an instrument is put at sea and finished once the instrument is recovered from sea. Table~\ref{tab:deployment} summarizes the deployments performed during the experiment and the available processing levels.
 
 \begin{description}
@@ -156,11 +154,14 @@ Profiling floats	& 3						& 2014-05-25	& beyond the experiment & \checkmark 	& \
 \belowtable{} % Table Footnotes
 \end{table*}
 
-\subsubsection{Quality control}
+\subsection{Quality control}
 
 Automated data QC is part of the processing routine of the SOCIB Data Center: most of the datasets provided with this paper come with a set of flags reflecting the quality of the measurements, based on different tests regarding the range of the measurements, the presence of spikes, the displacement of the platform and the correctness of the metadata.
 
-The QC are based on existing standards for most of the platforms. They are extensively described in the Quality Information Document \citep{SOCIBQC2018}. The description platform by platform is provided in the next Section. The flags used on the data are described in Tab.~\ref{tab:qc}
+The QC procedures are based on existing standards for most of the platforms. They are extensively described in the Quality Information Document \citep{SOCIBQC2018}. A platform-by-platform description is provided in the next Section.
+
+\subsubsection{Quality flags}
+The flags used on the data are described in Tab.~\ref{tab:qc}.
 
 \begin{table}[htpb]
 \caption{Quality Control Flags.\label{tab:qc}}
@@ -178,13 +179,25 @@ Code		&  Meaning  			\\
 9			&  Missing data			\\
 \bottomhline
 \end{tabular}
-\belowtable{} % Table Footnotes
 \end{table}
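In practice, the flags can be used to mask suspect measurements before analysis. A minimal sketch, assuming numpy arrays and using only the flag values mentioned in the text (1 good, 2 probably good, 4 bad, 6 spike, 9 missing); the function name and the set of retained flags are our own illustrative choices, not part of the SOCIB files:

```python
import numpy as np

# Flags considered usable for analysis (illustrative choice):
# 1 = good data, 2 = probably good.
GOOD_FLAGS = [1, 2]

def mask_bad_data(values, flags):
    """Replace measurements whose QC flag is not 'good' by NaN."""
    values = np.asarray(values, dtype=float)
    flags = np.asarray(flags)
    keep = np.isin(flags, GOOD_FLAGS)
    return np.where(keep, values, np.nan)

temperature = [13.2, 13.4, 25.0, 13.5]
qc_flags    = [1,    2,    4,    1]
print(mask_bad_data(temperature, qc_flags))  # third value becomes nan
```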
 
-For some platforms, the automated QC are not implemented yet:
+\subsubsection{QC tests}
+
+The main tests performed on the data are:
 \begin{description}
-\item[Gliders:] for the present version of the data, no QC are performed on this platform. A set of quality checks have been added to the glider toolbox \citep[][and available at \url{https://github.com/socib/glider_toolbox}]{TROUPIN16} and are in testing phase at the time of the writing. The QC included tests on $NaN$ values, impossible date or location, valid ranges (depending on depth) for the variables, spikes, gradients and flat lines (constant value over a large range of depths) in profiles. The later check proved useful for conductivity (and hence density and salinity). This new QC step will then be included to the general procedure and new netCDF files will be produced and made available as a new version of the present dataset.
-\item[CTD profiles:] the situation is similar to the gliders: new tests have been recently added to the processing chain at SOCIB, hence the AlborEx CTD profiles will have to be reprocessed in order to assign the quality flags to the measurements. These tests are essentially based on the range of measured values depending on each variable and the presence of strong vertical variations spike within a profile.
+\item[range:] depending on the variable considered, low and high thresholds are assigned. A global range test is applied first: if a measured value falls outside this range, its flag is set to 4 (bad data). A regional range test is then applied: measurements outside this range are assigned the flag 2 (probably good).
+\item[spike:] the test computes the difference between sequential measurements (i.e., measurements not taken at the same time). For the $\jmath$-th measurement:
+\begin{equation}
+\mathrm{spike} = \left|V_{\jmath} - \frac{V_{\jmath+1} + V_{\jmath-1}}{2} \right| - \left| \frac{V_{\jmath+1} - V_{\jmath-1}}{2} \right|
+\end{equation}
+When the spike value exceeds the threshold (which depends on the variable), the flag is set to 6.
+\item[gradient:] the gradient of each variable is computed along different coordinates (horizontal, depth, time).
+\item[stationarity:] the test checks whether the measurements exhibit some variability over a period of time, by computing the difference between the extreme values over that period.
+\end{description}
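The range and spike tests above can be sketched as follows; the threshold values and function names are illustrative assumptions, not the operational SOCIB settings, and the spike statistic follows the equation given above:

```python
import numpy as np

def range_test(values, global_range, regional_range):
    """Flag 4 (bad) outside the global range, 2 (probably good)
    outside the regional range, 1 (good) otherwise."""
    values = np.asarray(values, dtype=float)
    flags = np.ones(values.shape, dtype=int)
    lo, hi = regional_range
    flags[(values < lo) | (values > hi)] = 2
    lo, hi = global_range
    flags[(values < lo) | (values > hi)] = 4
    return flags

def spike_test(values, threshold):
    """Flag 6 where the spike statistic exceeds the threshold:
    spike = |V_j - (V_{j+1} + V_{j-1})/2| - |(V_{j+1} - V_{j-1})/2|"""
    v = np.asarray(values, dtype=float)
    flags = np.ones(v.shape, dtype=int)
    spike = (np.abs(v[1:-1] - 0.5 * (v[2:] + v[:-2]))
             - np.abs(0.5 * (v[2:] - v[:-2])))
    flags[1:-1][spike > threshold] = 6
    return flags

temp = [13.1, 13.2, 18.0, 13.3, 13.4]                # 18.0 is a spike
print(range_test(temp, (-2.0, 40.0), (10.0, 16.0)))  # [1 1 2 1 1]
print(spike_test(temp, threshold=2.0))               # [1 1 6 1 1]
```

Note that the first and last points of a series cannot be evaluated by the spike test, since the statistic needs both neighbours; they keep their default flag here.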
+
+Note that the tests described above are not yet applied to the glider data, as their processing is done outside the general SOCIB processing chain. The tests have nevertheless been implemented in the glider toolbox \citep[][available at \url{https://github.com/socib/glider_toolbox}]{TROUPIN16} and will be made operational once they have been properly tested and validated.
-\end{description}
 
 As the new files will not be available before a full reprocessing of all the historical missions, the decision was taken to provide the data files in their current state. A new version will be uploaded as soon as the processing has been performed.
@@ -256,8 +269,12 @@ The gliders follow a 3-dimensional trajectory in the water column but for some s
 
-\subsubsection{Configuration}
 
-The glider compass was also calibrated following \cite{MERCKELBACH2008}.
 
+\subsubsection{Quality checks}
+
+Before the deployment, the glider compass was calibrated following \cite{MERCKELBACH2008}. The thermal lag affecting the unpumped Sea-Bird CTD sensors installed on the deep and coastal gliders is corrected using the procedure described in \citet{GARAU11}.
+
+The checks not yet applied but planned for the next release of the glider toolbox include: the removal of $NaN$ values, the detection of impossible dates or locations, valid ranges (depending on depth) for the variables, spikes, gradients, and constant values over a large range of depths in the profiles. The tests performed showed that the constant-value check is particularly useful for conductivity (and hence density and salinity). A new version of the present dataset will be released once these new checks are made operational.
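The constant-value (flat line) check mentioned above can be sketched with a sliding window; the window length and tolerance below are illustrative assumptions, not the toolbox settings:

```python
import numpy as np

def flat_line_check(profile, window=5, tol=1e-4):
    """Return True at indices where the value is essentially constant
    over `window` consecutive samples (e.g. a stuck conductivity cell)."""
    v = np.asarray(profile, dtype=float)
    flat = np.zeros(v.shape, dtype=bool)
    for i in range(len(v) - window + 1):
        seg = v[i:i + window]
        if np.nanmax(seg) - np.nanmin(seg) < tol:
            flat[i:i + window] = True
    return flat

# Five identical consecutive conductivity values are marked as flat.
cond = [3.51, 3.52, 3.80, 3.80, 3.80, 3.80, 3.80, 3.49]
print(flat_line_check(cond))
```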
 
 \subsection{Surface drifters}