Constants
Specific constants to be used in turn_by_turn, to help with consistency.
- class turn_by_turn.constants.MetaDict[source]
Metadata dictionary, to type-hint known entries. None of the entries are required (total=False).
- date
Date of the measurement/creation of the data
- Type:
datetime
- file
Path to the file the data was loaded from (if available).
- Type:
Path | str
- machine
Name of the machine the data was measured/simulated on.
- Type:
str
- source_datatype
The datatype this data was loaded from.
- Type:
str
- comment
Any comment on the measurement.
- Type:
str
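For illustration, here is a minimal sketch of building such a dictionary; all values below are placeholders, and since the TypedDict is declared with total=False, any subset of the documented keys is valid:

```python
from datetime import datetime
from pathlib import Path

from turn_by_turn.constants import MetaDict

# Any subset of the documented keys is valid (total=False)
meta: MetaDict = {
    "date": datetime.now(),
    "file": Path("measurement.sdds"),  # placeholder path
    "machine": "LHC",
    "source_datatype": "lhc",
    "comment": "example metadata entry",
}
```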
IO
This module contains high-level I/O functions to read and write turn-by-turn data objects to and from different formats.
Reading Data
Since version 0.9.0 of the package, data can be loaded either from file or from in-memory structures exclusive to certain codes (e.g. tracking simulation results from MAD-NG or xtrack).
Two different APIs are provided for these use cases.
- To read from file, use the read_tbt function (exported as read at the package's level). The file format is detected or specified by the datatype parameter.
- To load in-memory data, use the convert_to_tbt function (exported as convert at the package's level). This is valid for tracking simulation results from e.g. xtrack or returned by MAD-NG.
In both cases, the returned value is a structured TbtData object.
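A minimal sketch of both APIs (the file name is a placeholder, and the in-memory conversion assumes an existing xtrack.Line with tracking results):

```python
from turn_by_turn import read, convert

# Read turn-by-turn data from disk; `datatype` selects the reader
tbt_data = read("measurement.sdds", datatype="lhc")

# In-memory conversion, e.g. for an existing xtrack.Line with tracking results:
# tbt_from_line = convert(line, datatype="xtrack")
```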
Writing Data
The single entry point for writing to disk is the write_tbt function (exported as write at the package's level). It writes a TbtData object to disk, by default in the LHC SDDS format. The output file extension and format are determined by the datatype argument.
The following cases arise:
- If datatype is set to lhc, sps or ascii, the output will be in SDDS format and the file extension will be set to .sdds if not already present.
- If datatype is set to madng, the output will be in a TFS file (extension .tfs is recommended).
- Other supported datatypes (see WRITERS) will use their respective formats and conventions if implemented.
The datatype parameter controls both the output format and any additional options passed to the underlying writer.
Should the noise parameter be used, random noise will be added to the data before writing. A seed can be provided for reproducibility.
Example:

```python
from turn_by_turn import write

write("output.sdds", tbt_data)  # writes in SDDS format by default
write("output.tfs", tbt_data, datatype="madng")  # writes a TFS file in MAD-NG's tracking results format
write("output.sdds", tbt_data, noise=0.01, seed=42)  # reproducibly adds noise before writing
```
While data can be loaded from the formats of different machines/codes (each through its own reader module), writing currently defaults to the LHC's SDDS format unless another supported format is specified. The interface is designed to be future-proof and easy to extend for new formats.
Supported Modules and Limitations
The following table summarizes which modules support disk reading and in-memory conversion, and any important limitations:
| Module | Disk Reading | In-Memory Conversion | Notes / Limitations |
|---|---|---|---|
| lhc | Yes (SDDS, ASCII) | No | Reads LHC SDDS and legacy ASCII files. |
| sps | Yes (SDDS, ASCII) | No | Reads SPS SDDS and legacy ASCII files. |
| doros | Yes (HDF5) | No | Reads DOROS HDF5 files. |
| madng | Yes (TFS) | Yes | In-memory: only via pandas/tfs DataFrame. |
| xtrack | No | Yes | Only in-memory via xtrack.Line. |
| ptc | Yes (trackone) | No | Reads MAD-X PTC trackone files. |
| iota | Yes (HDF5) | No | Reads IOTA HDF5 files. |
| ascii | Yes (legacy ASCII) | No | For legacy ASCII files only. |
| trackone | Yes (MAD-X) | No | Reads MAD-X trackone files. |
Only madng and xtrack support in-memory conversion; most modules are for disk reading only. Some modules are experimental or have limited support.
API
- turn_by_turn.io.additional_args(datatype: str) → dict[str, Any][source]
Additional parameters to be added to the reader/writer function.
- Parameters:
datatype (str) -- Type of the data.
- turn_by_turn.io.convert_to_tbt(file_data: DataFrame | Line, datatype: str = 'xtrack') → TbtData[source]
Convert a pandas or tfs DataFrame (MAD-NG) or a Line (xtrack) to a TbtData object.
- Parameters:
file_data (Union[DataFrame, xt.Line]) -- The data to convert.
datatype (str) -- The type of the data, either 'xtrack' or 'madng'. Defaults to 'xtrack'.
- Returns:
The converted TbtData object.
- Return type:
TbtData
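For example, MAD-NG tracking results loaded with the tfs-pandas package could be converted like this (a sketch; the file name is a placeholder):

```python
import tfs

from turn_by_turn import convert

# Load MAD-NG tracking results as a tfs DataFrame, then convert
df = tfs.read("track_results.tfs")  # placeholder file name
tbt_data = convert(df, datatype="madng")
```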
- turn_by_turn.io.read_tbt(file_path: str | Path, datatype: str = 'lhc') → TbtData[source]
Calls the appropriate loader for the provided matrices type and returns a TbtData object of the loaded matrices.
- Parameters:
file_path (Union[str, Path]) -- path to a file containing TbtData.
datatype (str) -- type of matrices in the file, determines the reader to use. Case-insensitive, defaults to lhc.
- Returns:
A TbtData object with the loaded matrices.
- turn_by_turn.io.write_tbt(output_path: str | Path, tbt_data: TbtData, noise: float = None, seed: int = None, datatype: str = 'lhc') → None[source]
Write a TbtData object's data to file, by default in the LHC's SDDS format.
- Parameters:
output_path (Union[str, Path]) -- path to the disk location where to write the data.
tbt_data (TbtData) -- the TbtData object to write to disk.
noise (float) -- optional noise to add to the data.
seed (int) -- a given seed to initialise the RNG if one chooses to add noise. This is useful to ensure the exact same RNG state across operations. Defaults to None, which means any new RNG operation in noise addition will pull fresh entropy from the OS.
datatype (str) -- type of matrices in the file, determines the writer to use. Case-insensitive, defaults to lhc.
Structures
Data structures to be used in turn_by_turn to store turn-by-turn measurement data.
- class turn_by_turn.structures.TbtData(matrices: Sequence[DataType], nturns: int, bunch_ids: list[int] | None = None, meta: MetaDict = <factory>)[source]
Object holding a representation of a Turn-by-Turn data measurement. The date of the measurement, the transverse data, number of turns and bunches as well as the bunch IDs are encapsulated in this object.
- matrices
Sequence of TransverseData or TrackingData objects. Each entry corresponds to a "bunch" or "particle" (tracking).
- Type:
Sequence[DataType]
- nturns
Number of turns. Needs to be a positive integer. It is assumed all bunches (and observation points in all entries therein) have the same length in the turn-dimension (columns).
- Type:
int
- bunch_ids
List of bunch/particle IDs. Will default to [0, 1, ..., nbunches-1] if not set.
- Type:
list[int] | None
- meta
Dictionary of metadata.
- Type:
MetaDict
- nbunches
Number of bunches/particles. Automatically set (i.e. cannot be set in the initialization of this object).
- Type:
int
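As a sketch, a single-bunch TbtData can be assembled from one TransverseData entry; the BPM names and values below are made up:

```python
import numpy as np
import pandas as pd

from turn_by_turn.structures import TbtData, TransverseData

# Two observation points, five turns; the index holds BPM names
bpms = ["BPM.A", "BPM.B"]  # placeholder names
x = pd.DataFrame(np.random.rand(2, 5), index=bpms)
y = pd.DataFrame(np.random.rand(2, 5), index=bpms)

tbt = TbtData(matrices=[TransverseData(X=x, Y=y)], nturns=5)
print(tbt.nbunches)   # 1, set automatically from the number of matrices
print(tbt.bunch_ids)  # [0], the default when not provided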
- class turn_by_turn.structures.TrackingData(X: pd.DataFrame, PX: pd.DataFrame, Y: pd.DataFrame, PY: pd.DataFrame, T: pd.DataFrame, PT: pd.DataFrame, S: pd.DataFrame, E: pd.DataFrame)[source]
Object holding multidimensional turn-by-turn simulation data in the form of pandas DataFrames.
The DataFrames should be N_observation-points x M_turns matrices, and usually contain the BPM/observation-point names as index, while the columns are simply numbered starting from 0. All DataFrames should have the same N x M size.
- X
Horizontal position data
- Type:
pd.DataFrame
- PX
Horizontal momentum data
- Type:
pd.DataFrame
- Y
Vertical position data
- Type:
pd.DataFrame
- PY
Vertical momentum data
- Type:
pd.DataFrame
- T
Negative time difference wrt. the reference particle (multiplied by c)
- Type:
pd.DataFrame
- PT
Energy difference wrt. the reference particle divided by the ref. momentum (multiplied by c)
- Type:
pd.DataFrame
- S
Longitudinal position data
- Type:
pd.DataFrame
- E
Energy data
- Type:
pd.DataFrame
- classmethod fieldnames() → list[str][source]
Return a list of the fields of this dataclass.
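A quick way to inspect the per-plane fields, assuming the returned list follows the declaration order shown above:

```python
from turn_by_turn.structures import TrackingData

# Fields of the dataclass, expected in declaration order
print(TrackingData.fieldnames())  # ['X', 'PX', 'Y', 'PY', 'T', 'PT', 'S', 'E']
```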
- class turn_by_turn.structures.TransverseData(X: pd.DataFrame, Y: pd.DataFrame)[source]
Object holding measured turn-by-turn data for both transverse planes in the form of pandas DataFrames.
The DataFrames should be N_(observation-points) x M_(turns) matrices, and usually contain the BPM/observation-point names as index, while the columns are simply numbered starting from 0. All DataFrames should have the same N x M size.
- X
Horizontal position data
- Type:
pd.DataFrame
- Y
Vertical position data
- Type:
pd.DataFrame
- classmethod fieldnames() → list[str][source]
Return a list of the fields of this dataclass.
Utils
Utility functions for convenience operations on turn-by-turn data objects in this package.
- turn_by_turn.utils.add_noise(data: ndarray, noise: float | None = None, sigma: float | None = None, seed: int | None = None) → ndarray[source]
Returns the given data with added noise. Noise is generated as a standard normal distribution (mean=0, standard_deviation=1) with the size of the input data, and scaled by a factor before being added to the provided data. Said factor can either be provided, or calculated from the input data's own standard deviation.
- Parameters:
data (np.ndarray) -- your input data.
noise (float) -- the scaling factor applied to the generated noise.
sigma (float) -- if provided, then that number times the standard deviation of the input data will be used as scaling factor for the generated noise.
seed (int) -- a given seed to initialise the RNG.
- Returns:
A new numpy array with added noise to the provided data.
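A short sketch of both scaling modes:

```python
import numpy as np

from turn_by_turn.utils import add_noise

rng = np.random.default_rng(0)
data = rng.normal(size=(3, 10))

# Scale the generated standard-normal noise by a fixed factor
noisy = add_noise(data, noise=0.05, seed=42)

# Or scale it by 0.1 times the input data's own standard deviation
noisy_relative = add_noise(data, sigma=0.1, seed=42)
```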
- turn_by_turn.utils.add_noise_to_tbt(data: TbtData, noise: float = None, sigma: float = None, seed: int = None) → TbtData[source]
Returns a new copy of the given TbT data with added noise. The noise is generated by turn_by_turn.utils.add_noise() from a single RNG, i.e. the noise is not repeated on each dataframe.
- Parameters:
data (TbtData) -- your input TbT-data.
noise (float) -- the scaling factor applied to the generated noise.
sigma (float) -- if provided, then that number times the standard deviation of the input data will be used as scaling factor for the generated noise.
seed (int) -- a given seed to initialise the RNG.
- Returns:
A copy of the TbtData with noise added to all matrices.
- turn_by_turn.utils.all_elements_equal(iterable: Iterable) → bool[source]
Check if all elements in an iterable are equal. WARNING: Does not necessarily work with floating point numbers.
- Parameters:
iterable (Iterable) -- an iterable to check.
- Returns:
True if all elements are equal, False otherwise.
- Return type:
bool
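For example:

```python
from turn_by_turn.utils import all_elements_equal

print(all_elements_equal([4, 4, 4]))   # True
print(all_elements_equal(["a", "b"]))  # False
# Beware of floating point values, as noted in the warning above
```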
- turn_by_turn.utils.generate_average_tbtdata(tbtdata: TbtData) → TbtData[source]
Takes a TbtData object and returns another containing the averaged matrices over all bunches/particles at all used BPMs.
- Parameters:
tbtdata (TbtData) -- entry TbtData object from measurements.
- Returns:
A new TbtData object with the averaged matrices.
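A sketch of its use, assuming `tbt` is a multi-bunch TbtData object (see the TbtData example above); since the matrices are averaged over all bunches, the result is expected to hold a single matrix:

```python
from turn_by_turn.utils import generate_average_tbtdata

averaged = generate_average_tbtdata(tbt)  # `tbt` assumed to hold several bunches
print(averaged.nbunches)  # expected: 1, after averaging over all bunches
```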
- turn_by_turn.utils.get_averaged_data(bpm_names: Sequence[str], matrices: Sequence[TransverseData], plane: str, turns: int) → np.ndarray[source]
Average data from a given plane from the matrices of a TbtData.
- Parameters:
bpm_names (Sequence[str]) -- names of the BPMs to average data for.
matrices (Sequence[TransverseData]) -- matrices from a TbtData object.
plane (str) -- name of the given plane to average in.
turns (int) -- number of turns in the provided data.
- Returns:
A numpy array with the averaged data for the given bpms.
- turn_by_turn.utils.matrices_to_array(tbt_data: TbtData) → ndarray[source]
Convert the matrices of a TbtData object to a numpy array.
- Parameters:
tbt_data (TbtData) -- TbtData object to convert the data from.
- Returns:
A numpy array with the matrices data.
- turn_by_turn.utils.numpy_to_tbt(names: np.ndarray, matrix: np.ndarray, datatype: TransverseData | TrackingData = TransverseData) → TbtData[source]
Converts turn-by-turn matrices and names into a TbtData object.
- Parameters:
names (np.ndarray) -- Numpy array of BPM names.
matrix (np.ndarray) -- 4D Numpy array [quantity, BPM, particle/bunch No., turn No.], with quantities in order [x, y].
datatype (DataType) -- The type of data to be converted to in the matrices. Either TransverseData (which implies reading X and Y fields) or TrackingData (which implies reading all 8 fields). Defaults to TransverseData.
- Returns:
A TbtData object loaded with the matrices in the provided numpy arrays.
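A minimal sketch with made-up names and data, following the documented [quantity, BPM, bunch, turn] layout:

```python
import numpy as np

from turn_by_turn.utils import numpy_to_tbt

names = np.array(["BPM.A", "BPM.B"])  # placeholder BPM names
matrix = np.random.rand(2, 2, 1, 10)  # [quantity (x, y), BPM, bunch, turn]

tbt = numpy_to_tbt(names, matrix)     # datatype defaults to TransverseData
print(tbt.nturns)    # 10
print(tbt.nbunches)  # 1
```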