Utils¶
Package Content¶
Module utils.absolute_ratio_beta_amp_beta_phase¶
Module utils.ADDbpmerror¶
Created April 2014
maintainer: Yngve Inntjore Levinsen
author: Yngve Inntjore Levinsen (based on an awk script by Rogelio Tomas)
-
utils.ADDbpmerror.convert_files(nparticles=1, infile='trackone', outfile='ALLBPMs', x_error=0.0, y_error=0.0, n_faulty=0)[source]¶ Similar to the old awk script ALLBPMs.
The main difference is that this version can handle track files with more than one particle. In that case it writes each particle's track to a separate file.
Parameters: - nparticles (int) -- Number of particles in the track file
- infile (string) -- Name of input file
- outfile (string) -- Name of output file
- x_error (float) -- Sigma of noise in horizontal plane
- y_error (float) -- Sigma of noise in vertical plane
- n_faulty (int) -- Number of faulty BPMs
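The noise part of this conversion can be sketched as follows. `add_bpm_noise` is a hypothetical helper (not part of the module) that assumes the readings of one plane are available as a numpy array; `x_error` and `y_error` play the role of `sigma`:

```python
import numpy as np

def add_bpm_noise(readings, sigma, rng=None):
    """Add zero-mean Gaussian noise with standard deviation sigma to BPM readings."""
    rng = np.random.default_rng(0) if rng is None else rng
    readings = np.asarray(readings, dtype=float)
    if sigma == 0.0:
        return readings  # no noise requested, e.g. x_error=0.0
    return readings + rng.normal(0.0, sigma, size=readings.shape)

x = np.zeros(1000)               # ideal horizontal readings
noisy = add_bpm_noise(x, sigma=0.1)  # readings with 0.1 sigma noise
```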
Module utils.bpm¶
Created on 3 Jun 2013
This module contains helper functions concerning BPMs: functions to filter BPMs, or to intersect the BPMs in multiple files or with a given model file.
-
utils.bpm.filterbpm(list_of_bpms)[source]¶ Filters out non-arc BPMs.
Returns: list -- a list of those BPMs whose name starts with "BPM."
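A minimal sketch of such a filter, assuming (as elsewhere in this module) that BPMs are given as (S, name) tuples; the tuple layout is an assumption, not stated in the docstring:

```python
def filterbpm_sketch(list_of_bpms):
    # Keep only arc BPMs: entries whose name (second tuple element) starts with "BPM."
    return [bpm for bpm in list_of_bpms if bpm[1].upper().startswith("BPM.")]

bpms = [(0.0, "BPM.10L1.B1"), (5.3, "BPMSW.1L1.B1"), (9.9, "BPM.11L1.B1")]
```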
-
utils.bpm.get_list_of_tuples(bpms)[source]¶ Transforms the DataFrame bpms into a list of tuples, to mimic the old usage.
-
utils.bpm.intersect(list_of_twiss_files)[source]¶ Pure intersection of all BPM names in all files.
Parameters: list_of_twiss_files (list) -- List of metaclass.Twiss objects with columns NAME and S.
Returns: list with tuples (<S_value_i>, <bpm_i>) -- bpm_i is in every twiss of list_of_twiss_files.
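The intersection idea can be sketched with plain dictionaries standing in for the Twiss objects (the real function takes metaclass.Twiss objects; the {NAME: S} model here is an assumption for illustration):

```python
def intersect_sketch(list_of_twiss):
    # Each "twiss" is modeled as a dict {NAME: S}. Intersect the names of all
    # files, then return (S, name) tuples sorted by the S of the first file.
    common = set(list_of_twiss[0])
    for twiss in list_of_twiss[1:]:
        common &= set(twiss)
    return sorted((list_of_twiss[0][name], name) for name in common)

t1 = {"BPM.A": 1.0, "BPM.B": 2.0, "BPM.C": 3.0}
t2 = {"BPM.B": 2.1, "BPM.C": 3.1}
```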
-
utils.bpm.intersect_with_bpm_list(exp_bpms, bpm_list)[source]¶ Intersects the BPMs in exp_bpms with the given list of BPM names.
Parameters: - exp_bpms (list) -- list with tuples (<S_value_i>, <bpm_i>)
- bpm_list (list) -- List of BPM names
Returns: list with tuples (<S_value_i>, <bpm_i>) -- A list with BPMs which are both in exp_bpms and bpm_list.
-
utils.bpm.model_intersect(exp_bpms, model_twiss)[source]¶ Intersects the BPMs in exp_bpms with those in the model.
Parameters: - exp_bpms (list) -- list with tuples (<S_value_i>, <bpm_i>)
- model_twiss (metaclass.Twiss) -- Should be a Twiss object from the model
Returns: list with tuples (<S_value_i>, <bpm_i>) -- A list with BPMs which are both in exp_bpms and model_twiss.
Module utils.contexts¶
Module utils.convert2json¶
Created on ??
Loads the dictionaries returned by the functions of the AllLists.py python script and dumps them as a json file.
Module utils.dict_tools¶
-
class utils.dict_tools.DictParser(dictionary=None, strict=False)[source]¶ Provides functions to parse a dictionary.
First build a dictionary structure with Arguments as leaves, via add_argument or on init. A similarly structured option dictionary with the values as leaves can then be parsed.
-
add_argument_dict(dictionary, loc)[source]¶ Appends a complete sub-dictionary to the existing argument structure at node 'loc'.
Parameters: - loc -- Location of the node to append the sub-dictionary to
- dictionary -- The dictionary to append
Returns: This object
-
add_parameter(param, **kwargs)[source]¶ Adds a parameter to the parser.
If you want it to be a parameter of a sub-dictionary, add the 'loc=subdict.subdict' keyword to the input.
Parameters: - param -- Parameter to add (either an object of class Argument or a string defining the name)
- kwargs -- Any of the argument-fields (apart from 'name') and/or 'loc'
Returns: This object
-
parse_arguments(arguments)[source]¶ Parses a given option dictionary and returns the parsed options.
Parameters: arguments -- Arguments to parse
Returns: Parsed options
-
utils.dict_tools.get_subdict(full_dict, keys, strict=True)[source]¶ Returns a sub-dictionary of full_dict containing only the keys in keys.
Parameters: - full_dict -- Dictionary to extract from
- keys -- Keys to extract
- strict -- If False, keys not in full_dict are ignored; otherwise they raise an error. Default: True
Returns: Extracted sub-dictionary
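A minimal sketch of this behaviour (not the library's implementation); in strict mode a missing key is assumed to surface as a KeyError:

```python
def get_subdict_sketch(full_dict, keys, strict=True):
    if strict:
        # Missing keys raise KeyError.
        return {key: full_dict[key] for key in keys}
    # Non-strict: silently ignore keys not present in full_dict.
    return {key: full_dict[key] for key in keys if key in full_dict}
```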
Module utils.entry_datatypes¶
Advanced Datatypes to add as type to entrypoint. Or any parser, really.
-
class utils.entry_datatypes.BoolOrList[source]¶ A class that behaves like a boolean when possible, otherwise like a list.
Hint: ‘list.__new__(list, value)’ returns an empty list.
-
class utils.entry_datatypes.BoolOrString[source]¶ A class that behaves like a boolean when possible, otherwise like a string.
-
utils.entry_datatypes.get_instance_faker_meta(*classes)[source]¶ Returns the metaclass that fakes the isinstance() checks.
-
utils.entry_datatypes.get_multi_class(*classes)[source]¶ Creates a class 'behaving' like all classes in classes.
When a value needs to be converted to a class in this list, casting the input is attempted in the given order (i.e. string-classes need to go at the end, as they 'always' succeed).
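The cast-in-order idea can be sketched as follows; `cast_in_order` is a hypothetical stand-in for the conversion step, not the actual class machinery:

```python
def cast_in_order(value, classes):
    # Try each class in order; string-like classes go last, since str() rarely fails.
    for cls in classes:
        try:
            return cls(value)
        except (TypeError, ValueError):
            continue
    raise ValueError(f"Could not cast {value!r} to any of {classes}")
```

For example, with classes (int, float, str), the string "42" becomes an int, "1.5" a float, and anything else stays a string.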
Module utils.entrypoint¶
Entry Point Decorator
Allows a function to be decorated as entrypoint. This function will then automatically accept console arguments, config files, json files, kwargs and dictionaries as input and will parse it according to the parameters given to the entrypoint-Decorator.
Terminology:¶
- Parameter - Items containing info on how to parse Arguments
- Arguments - The input to the wrapped function
- Options - The parsed arguments and hence the options of the function
Hence, an ArgumentError will be raised if something goes wrong during parsing, while a ParameterError will be raised if something goes wrong when adding parameters to the list.
Usage:¶
To be used as a decorator:
@entrypoint(parameters)
def some_function(options, unknown_options)
Using strict mode (see below):
@entrypoint(parameters, strict=True)
def some_function(options)
It is also possible to use the EntryPoint Class similar to a normal parser:
ep_parser = EntryPoint(parameters)
options, unknown_options = ep_parser.parse(arguments)
Using strict mode (see below):
ep_parser = EntryPoint(parameters, strict=True)
options = ep_parser.parse(arguments)
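A minimal sketch of the parsing idea behind these calls (not the real EntryPoint implementation): parameters carry names and defaults, and dictionary arguments are split into known options and unknown leftovers, with strict mode raising instead of collecting:

```python
def parse_sketch(parameters, arguments, strict=False):
    # parameters: list of dicts with at least "name", optionally "default".
    known = {p["name"]: p.get("default") for p in parameters}
    options, unknown = dict(known), {}
    for key, value in arguments.items():
        if key in known:
            options[key] = value
        elif strict:
            raise KeyError(f"Unknown parameter: {key}")
        else:
            unknown[key] = value
    # strict mode returns only the options; non-strict also returns the leftovers.
    return options if strict else (options, unknown)

params = [{"name": "infile"}, {"name": "n", "default": 1}]
```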
Parameters:¶
Parameters need to be a list or a dictionary of dictionaries with the following keys:
- name (required): Name of the variable (e.g. for later use as options.NAME). If the parameters are given as a dictionary, the key is used as the name.
- flags (required): Commandline flag(s), e.g. --file
- required (optional): bool
- default (optional): Default value, if the variable is not present
- help (optional): str
- type (optional): Value type (if nargs is given, set to list for dicts!)
- choices (optional): Choices to choose from (choices need to be of type, if given)
- nargs (optional): Number of arguments to consume (commandline only, do not use REMAINDER!)
- action (optional): Either store_true or store_false, will set type to bool and the default to False and True respectively.
The strict option changes the behaviour for unknown parameters: strict=True raises exceptions, strict=False logs debug messages and returns the options. Hence a wrapped function with strict=True must accept one input, with strict=False two.
Default: False
-
class utils.entrypoint.EntryPointParameters(*args, **kwargs)[source]¶ Helps to build a simple dictionary structure via add_argument functions.
You really don't need that, but old habits die hard.
-
utils.entrypoint.add_params_to_generic(parser, params)[source]¶ Adds entry-point-style parameters to either an ArgumentParser, DictParser or EntryPointArguments.
-
utils.entrypoint.create_parameter_help(module, param_fun=None)[source]¶ Prints the parameter help quickly, changing the logging format first.
Usage Example:
import amplitude_detuning_analysis
create_parameter_help(amplitude_detuning_analysis)
create_parameter_help(amplitude_detuning_analysis, "_get_plot_params")
-
class utils.entrypoint.entrypoint(parameter, strict=False)[source]¶ Decorator extension of EntryPoint.
Implements the __call__ method needed for decorating. Lowercase looks nicer when used as a decorator.
-
utils.entrypoint.param_names(params)[source]¶ Gets the names of the parameters, no matter if they are a dict or a list of dicts.
-
utils.entrypoint.split_arguments(args, *param_list)[source]¶ Divides the remaining arguments into a list of argument-dicts, fitting the params in param_list.
Parameters: - args -- Input arguments, either as a list of strings or a dict
- param_list -- List of sets of entry-point parameters (either dict or list)
Returns: A list of dictionaries containing the arguments for each set of parameters, plus one more entry for unknown parameters. If the input was a list of argument-strings, the parameters will already be parsed.
Warning
Unless you know what you are doing, run this function only on remaining arguments from entry-point parsing, not on the actual arguments.
Warning
Adds each argument only once, to the set of params that claims it first!
Module utils.error_handling¶
Tools to handle errors more easily (and to avoid having to look it up on StackOverflow all the time).
-
utils.error_handling.append_error_message(err, message)[source]¶ Appends 'message' to the message of error 'err'.
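A sketch of how such a helper can work on an exception's args tuple (the module's actual implementation may differ):

```python
def append_error_message_sketch(err, message):
    # Exceptions keep their message in err.args; rebuild the tuple with the suffix.
    if err.args:
        err.args = (str(err.args[0]) + message,) + err.args[1:]
    else:
        err.args = (message,)
    return err

try:
    raise ValueError("File not found")
except ValueError as e:
    append_error_message_sketch(e, " (while reading the model)")
    msg = str(e)
```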
Module utils.htcondor_wrapper¶
Tools to use htcondor via python. Requires you to be on a computer where htcondor is set up. For local machines see: https://twiki.cern.ch/twiki/bin/view/ABPComputing/LxbatchHTCondor
Hint: I had to add the path of the newly installed modules to the batch_Krb5_credential script to be able to find the Authen::Krb5 package: (use lib “/home/jdilly/perl5/lib/perl5/x86_64-linux-gnu-thread-multi/”;)
Also, make sure the SCHEDD_NAME is set properly in the config file mentioned above. (i.e. go to lxplus, run condor_q and see which scheduler was assigned to you)
If you run jobs that take longer than 25h, make sure you are using a renewable Kerberos ticket: kinit -r 604800 (i.e. renewable by “kinit -R” for 1 week)
Python Readme: https://htcondor-python.readthedocs.io/en/latest
IMPORTANT: This functionality relies on all files being in a space shared between htcondor and the user.
-
utils.htcondor_wrapper.create_job_for_bashfile(bashfile, duration='longlunch')[source]¶ Returns a simple Submit() object for the bashfile.
-
utils.htcondor_wrapper.create_multijob_for_bashfiles(folder, n_files, duration='longlunch')[source]¶ Creates an HTCondor job assuming n_files bash-files.
-
utils.htcondor_wrapper.create_subfile_from_job(folder, job)[source]¶ Writes the file to submit to htcondor.
Module utils.iotools¶
Created on 1 Jul 2013
utils.iotools.py holds helper functions for input/output issues. This module is not intended to be executed.
Feel free to use and extend this module.
-
utils.iotools.append_string_to_textfile(path_to_textfile, str_to_append)[source]¶ If the file does not exist, a new file will be created.
-
utils.iotools.copy_content_of_dir(src_dir, dst_dir)[source]¶ Copies all files and directories from src_dir to dst_dir.
-
utils.iotools.copy_item(src_item, dest)[source]¶ Copies a file or a directory to dest. dest may be a directory. If src_item is a directory, all contained files and dirs will be copied into dest.
-
utils.iotools.deleteFilesWithoutGitignore(pathToDirectory)[source]¶ Deletes all files in the given pathToDirectory except the file named '.gitignore'.
Returns: bool -- True if the directory exists and the files were deleted, otherwise False.
-
utils.iotools.delete_content_of_dir(path_to_dir)[source]¶ Deletes all folders, files and symbolic links in the given directory.
Parameters: path_to_dir (string)
-
utils.iotools.delete_item(path_to_item)[source]¶ Deletes the item given by path_to_item. It distinguishes between a file, a directory and a symbolic link.
-
utils.iotools.get_all_absolute_filenames_in_dir_and_subdirs(path_to_dir)[source]¶ Looks for files (not dirs) in dir and subdirs and returns them as a list.
-
utils.iotools.get_all_dir_names_in_dir(path_to_dir)[source]¶ Looks for directories in dir and returns them as a list.
-
utils.iotools.get_all_filenames_in_dir(path_to_dir)[source]¶ Looks for files in dir (not subdirs) and returns them as a list.
-
utils.iotools.get_all_filenames_in_dir_and_subdirs(path_to_dir)[source]¶ Looks for files (not dirs) in dir and subdirs and returns them as a list.
-
utils.iotools.json_dumps_readable(json_outfile, object)[source]¶ This is how you write a beautiful json file.
Parameters: - json_outfile -- File to write
- object -- Object to dump
-
utils.iotools.replace_keywords_in_textfile(path_to_textfile, dict_for_replacing, new_output_path=None)[source]¶ Replaces all keywords in a textfile with the corresponding values in the dictionary. E.g. a textfile with the content "%(This)s will be replaced!" and the dict {"This": "xyz"} results in the textfile content "xyz will be replaced!".
Parameters: new_output_path -- If new_output_path is None, the source file will be replaced.
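The keyword syntax is Python's %-style mapping format, so the substitution step itself can be reproduced with the built-in operator:

```python
# %-style mapping substitution, as used in the textfile templates.
template = "%(This)s will be replaced!"
result = template % {"This": "xyz"}
```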
Module utils.logging_tools¶
-
class utils.logging_tools.DebugMode(active=True, log_file=None, add_date_to_fname=True)[source]¶ Context manager for the debug mode.
Hint: Does not work with @contextmanager from contextlib (even though the code would be nicer), as _get_caller would find contextlib.py.
Parameters: - active (bool) -- Defines if this manager is doing anything. (Default: True)
- log_file (str) -- File to log into.
-
class utils.logging_tools.TempFile(file_path, log_func)[source]¶ Context manager. Lets another function write into a temporary file and logs its contents.
It won't open the file though, so only the file's path is returned.
Parameters: - file_path (str) -- Place to write the tempfile to.
- log_func (func) -- The function with which the content should be logged (e.g. LOG.info)
-
utils.logging_tools.file_handler(logfile, level=10, fmt='%(levelname)7s | %(message)s | %(name)s')[source]¶ Convenience function so the caller does not have to import logging.
-
utils.logging_tools.getLogger(name)[source]¶ Convenience function so the caller does not have to import logging.
-
utils.logging_tools.get_logger(name, level_root=10, level_console=20, fmt='%(levelname)7s | %(message)s | %(name)s')[source]¶ Sets up the logger if name is __main__. Returns a logger based on the module name.
Parameters: - name -- Only used to check if __name__ is __main__
- level_root -- Main logging level, default DEBUG
- level_console -- Console logging level, default INFO
- fmt -- Format of the logging. For default see BASIC_FORMAT
Returns: Logger instance.
-
utils.logging_tools.log_pandas_settings_with_copy(*args, **kwds)[source]¶ Logs the pandas SettingWithCopy warning via log_func instead of printing the warning.
-
utils.logging_tools.logging_silence(*args, **kwds)[source]¶ Temporarily removes all loggers from the root logger.
-
utils.logging_tools.stream_handler(stream=<open file '<stdout>', mode 'w'>, level=10, fmt='%(levelname)7s | %(message)s | %(name)s', max_level=None)[source]¶ Convenience function so the caller does not have to import logging.
-
utils.logging_tools.unformatted_console_logging(*args, **kwds)[source]¶ Logs only to console and only unformatted.
Module utils.math_tools¶
Mathematical helper functions for everyone.
-
utils.math_tools.get_next_scientific_exponent(number)[source]¶ Returns the next exponent of 10 which is a multiple of 3, such that the coefficient is < 100.
Parameters: number -- Number to convert
Returns: Tuple of coefficient and exponent.
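A sketch of the idea, under the assumption that "next" means the smallest non-negative multiple-of-3 exponent (a reading of the docstring, not the module's actual code):

```python
def next_scientific_exponent_sketch(number):
    # Increase the exponent in steps of 3 until the coefficient drops below 100.
    exponent = 0
    coefficient = abs(number)
    while coefficient >= 100:
        exponent += 3
        coefficient /= 1000.0
    return coefficient, exponent
```

For example, 12345 becomes 12.345 * 10^3, matching engineering notation.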
-
utils.math_tools.mad(arr)[source]¶ Median Absolute Deviation: a "robust" version of standard deviation. Indicates the variability of the sample. See https://en.wikipedia.org/wiki/Median_absolute_deviation. Source: https://stackoverflow.com/a/23535934/5609590
Parameters: arr -- numpy array
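The MAD is the median of the absolute deviations from the median; a minimal sketch following the linked definition (without the optional 1.4826 consistency scaling some implementations apply):

```python
import numpy as np

def mad_sketch(arr):
    arr = np.asarray(arr, dtype=float)
    med = np.median(arr)
    return np.median(np.abs(arr - med))
```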
Module utils.print_cprofile¶
Module utils.read_bet_deviations¶
Module utils.reindent¶
reindent [-d][-r][-v] [ path … ]
-d (--dryrun)   Dry run. Analyze, but don't make any changes to, files.
-r (--recurse)  Recurse. Search for all .py files in subdirectories too.
-n (--nobackup) No backup. Does not make a ".bak" file before reindenting.
-v (--verbose)  Verbose. Print informative msgs; else no output.
-h (--help)     Help. Print this usage information and exit.
Change Python (.py) files to use 4-space indents and no hard tab characters. Also trim excess spaces and tabs from ends of lines, and remove empty lines at the end of files. Also ensure the last line ends with a newline.
If no paths are given on the command line, reindent operates as a filter, reading a single source file from standard input and writing the transformed source to standard output. In this case, the -d, -r and -v flags are ignored.
You can pass one or more file and/or directory paths. When a directory path, all .py files within the directory will be examined, and, if the -r option is given, likewise recursively for subdirectories.
If output is not to standard output, reindent overwrites files in place, renaming the originals with a .bak extension. If it finds nothing to change, the file is left alone. If reindent does change a file, the changed file is a fixed-point for future runs (i.e., running reindent on the resulting .py file won’t change it again).
The hard part of reindenting is figuring out what to do with comment lines. So long as the input files get a clean bill of health from tabnanny.py, reindent should do a good job.
The backup file is a copy of the one that is being reindented. The ".bak" file is generated with shutil.copy(), but some corner cases regarding user/group and permissions could leave the backup file more readable than you'd prefer. You can always use the --nobackup option to prevent this.
Module utils.stats¶
Created on 03/07/18
author: Lukas Malina
Provides statistical methods to compute:
- various weighted averages along a specified axis, and their errors
- an unbiased error estimator of an infinite normal distribution from a finite-sized sample
TODO: use weighted average and its error in circular calculations
TODO: write tests
TODO: LOGGER or raising errors and warnings?
TODO: if zeros or nans occur in errors, fall back to uniform weights only in the affected cases
-
utils.stats.circular_error(data, period=6.283185307179586, errors=None, axis=None, t_value_corr=True)[source]¶ Computes the error of the weighted circular average along the specified axis.
Parameters: - data -- array-like; contains the data to be averaged
- period -- scalar, optional; periodicity period of data, default is (2 * np.pi)
- errors -- array-like, optional; contains errors associated with the values in data, used to calculate weights
- axis -- int or tuple of ints, optional; axis or axes along which to average data
- t_value_corr -- bool, optional; specifies if the error is corrected for small sample size, default True
Returns: The error of the weighted circular average along the specified axis.
-
utils.stats.circular_mean(data, period=6.283185307179586, errors=None, axis=None)[source]¶ Computes the weighted circular average along the specified axis.
Parameters: - data -- array-like; contains the data to be averaged
- period -- scalar, optional; periodicity period of data, default is (2 * np.pi)
- errors -- array-like, optional; contains errors associated with the values in data, used to calculate weights
- axis -- int or tuple of ints, optional; axis or axes along which to average data
Returns: The weighted circular average along the specified axis.
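The standard way to compute a circular mean is via unit phasors; a sketch of that technique (an assumption about this module's method, using inverse-variance weights when errors are given):

```python
import numpy as np

def circular_mean_sketch(data, period=2 * np.pi, errors=None, axis=None):
    # Map the data onto unit phasors, average them (optionally weighted),
    # and map the angle of the mean phasor back to the original period.
    phases = 2 * np.pi * np.asarray(data, dtype=float) / period
    weights = None if errors is None else 1.0 / np.square(np.asarray(errors, dtype=float))
    mean_phasor = np.average(np.exp(1j * phases), weights=weights, axis=axis)
    return np.angle(mean_phasor) * period / (2 * np.pi)
```

Unlike a plain average, this handles wrap-around correctly: values just below and just above the period boundary average to the boundary, not to the middle of the range.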
-
utils.stats.effective_sample_size(data, weights, axis=None)[source]¶ Computes the effective sample size of weighted data along the specified axis.
Parameters: - data -- array-like
- weights -- array-like; contains weights associated with the values in data
- axis -- int or tuple of ints, optional; axis or axes along which the effective sample size is computed
Returns: The effective sample size along the specified axis.
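The usual definition is Kish's effective sample size, (sum of weights)^2 / (sum of squared weights); assuming that is the formula used here, a sketch:

```python
import numpy as np

def effective_sample_size_sketch(weights, axis=None):
    # Kish effective sample size: n_eff = (sum w)^2 / sum(w^2).
    w = np.asarray(weights, dtype=float)
    return np.sum(w, axis=axis) ** 2 / np.sum(w ** 2, axis=axis)
```

Equal weights recover the plain sample size; one dominant weight drives n_eff toward 1.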
-
utils.stats.t_value_correction(sample_size)[source]¶ Calculates the multiplicative correction factor to determine the standard deviation of a normally distributed quantity from the standard deviation of its finite-sized sample.
Parameters: sample_size -- array-like
Returns: Multiplicative correction factor(s) of the same shape as sample_size; can contain nans.
-
utils.stats.unbias_variance(data, weights, axis=None)[source]¶ Computes a correction factor to unbias the variance of the weighted average of data along the specified axis.
Parameters: - data -- array-like
- weights -- array-like; contains weights associated with the values in data
- axis -- int or tuple of ints, optional; axis or axes along which the effective sample size is computed
Returns: The correction factor to unbias the variance along the specified axis.
-
utils.stats.weighted_error(data, errors=None, axis=None, t_value_corr=True)[source]¶ Computes the error of the weighted average along the specified axis.
Parameters: - data -- array-like; contains the data to be averaged
- errors -- array-like, optional; contains errors associated with the values in data, used to calculate weights
- axis -- int or tuple of ints, optional; axis or axes along which to average data
- t_value_corr -- bool, optional; specifies if the error is corrected for small sample size, default True
Returns: The error of the weighted average along the specified axis.
-
utils.stats.weighted_mean(data, errors=None, axis=None)[source]¶ Computes the weighted average along the specified axis.
Parameters: - data -- array-like; contains the data to be averaged
- errors -- array-like, optional; contains errors associated with the values in data, used to calculate weights
- axis -- int or tuple of ints, optional; axis or axes along which to average data
Returns: The weighted average along the specified axis.
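Assuming inverse-variance weighting (the usual choice when weights come from errors, though not spelled out in the docstring), the computation can be sketched as:

```python
import numpy as np

def weighted_mean_sketch(data, errors=None, axis=None):
    # Inverse-variance weights from the errors; uniform weights if none given.
    weights = None if errors is None else 1.0 / np.square(np.asarray(errors, dtype=float))
    return np.average(data, weights=weights, axis=axis)
```

A value with a very large error barely contributes to the average.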
-
utils.stats.weighted_rms(data, errors=None, axis=None)[source]¶ Computes the weighted root mean square along the specified axis.
Parameters: - data -- array-like; contains the data to be averaged
- errors -- array-like, optional; contains errors associated with the values in data, used to calculate weights
- axis -- int or tuple of ints, optional; axis or axes along which to average data
Returns: The weighted root mean square along the specified axis.
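Under the same inverse-variance-weighting assumption as above, the weighted RMS is the square root of the weighted mean of the squared data:

```python
import numpy as np

def weighted_rms_sketch(data, errors=None, axis=None):
    # sqrt of the (optionally weighted) mean of the squared data.
    weights = None if errors is None else 1.0 / np.square(np.asarray(errors, dtype=float))
    return np.sqrt(np.average(np.square(np.asarray(data, dtype=float)),
                              weights=weights, axis=axis))
```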
-
utils.stats.weights_from_errors(errors, period=6.283185307179586)[source]¶ Computes weights from measurement errors; weights are not output if the errors contain zeros or nans.
Parameters: - errors -- array-like; contains errors which are used to calculate weights
- period -- scalar, optional; periodicity period of data, default is (2 * np.pi)
Returns: The weights computed from the errors.