IRamanSpectraWorkChain

workchain: aiida_vibroscopy.workflows.spectra.iraman.IRamanSpectraWorkChain

Workchain for automatically computing IR and Raman spectra using finite displacements and fields. For further details on the sub-workchains used, see also:

  • :class:`~aiida_vibroscopy.workflows.dielectric.base.DielectricWorkChain` for finite fields
  • :class:`~aiida_vibroscopy.workflows.phonons.base.PhononWorkChain` for finite displacements
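The work chain is importable from the module path shown above, so a process builder can be obtained in the usual AiiDA way. The snippet below is a minimal sketch (it assumes a configured AiiDA profile); the nested input namespaces are then populated on the builder, as shown in the fuller example after the inputs list.

    # Minimal sketch: obtain a process builder for this work chain.
    from aiida import load_profile
    from aiida_vibroscopy.workflows.spectra.iraman import IRamanSpectraWorkChain

    load_profile()
    builder = IRamanSpectraWorkChain.get_builder()
    # Populate builder.structure, builder.dielectric, builder.phonon, ... (see below).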

Inputs:

  • clean_workdir, Bool, optional – If True, work directories of all called calculations will be cleaned at the end of execution.
  • dielectric, Namespace – Inputs for the DielectricWorkChain that will be used to calculate the mixed derivatives with electric field (see the builder sketch after the inputs list).
    • central_difference, Namespace – The inputs for the central difference scheme.
      • accuracy, (Int, NoneType), optional – Central difference scheme accuracy to employ (i.e. number of points for the derivative evaluation). This must be an even, positive integer. If not specified, an automatic choice is made based on the intensity of the critical electric field.
      • diagonal_scale, Float, optional – Scaling factor for electric fields non-parallel to Cartesian axes (e.g. scale*(E,E,0)). 1/sqrt(2) guarantees the same norm in all directions (recommended).
      • electric_field_step, (Float, NoneType), optional – Electric field step in Ry atomic units used in the numerical differentiation. Only positive values. If not specified, an NSCF is run to evaluate the critical electric field; an electric field step is then extracted to secure a stable SCF.
    • clean_workdir, Bool, optional – If True, work directories of all called calculations will be cleaned at the end of execution.
    • kpoints_parallel_distance, (Float, NoneType), optional – Distance of the k-points in reciprocal space along the parallel direction of each applied electric field.
    • metadata, Namespace
      • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
      • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
      • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
      • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
    • parent_scf, (RemoteData, NoneType), optional – Scf parent folder from where restarting the scfs with electric fields.
    • property, str, required, non_db – Valid inputs are: ir, born-charges, dielectric, nac, bec, raman, susceptibility-derivative, non-linear-susceptibility.
    • scf, Namespace – Inputs for the PwBaseWorkChain that will be used to run the electric enthalpy scfs.
      • handler_overrides, (Dict, NoneType), optional – Mapping where keys are process handler names and the values are a dictionary, where each dictionary can define the enabled and priority key, which can be used to toggle the values set on the original process handler declaration.
      • kpoints, (KpointsData, NoneType), optional – An explicit k-points list or mesh. Either this or kpoints_distance has to be provided.
      • kpoints_distance, (Float, NoneType), optional – The minimum desired distance in 1/Å between k-points in reciprocal space. The explicit k-points will be generated automatically by a calculation function based on the input structure.
      • kpoints_force_parity, (Bool, NoneType), optional – Optional input when constructing the k-points based on a desired kpoints_distance. Setting this to True will force the k-point mesh to have an even number of points along each lattice vector except for any non-periodic directions.
      • max_iterations, Int, optional – Maximum number of iterations the work chain will restart the process to finish successfully.
      • metadata, Namespace
        • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
        • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
        • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
        • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
      • pw, Namespace
        • code, (AbstractCode, NoneType), optional – The Code to use for this job. This input is required, unless the remote_folder input is specified, which means an existing job is being imported and no code will actually be run.
        • hubbard_file, (SinglefileData, NoneType), optional – SinglefileData node containing the output Hubbard parameters from a HpCalculation
        • metadata, Namespace
          • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
          • computer, (Computer, NoneType), optional, is_metadata – When using a “local” code, set the computer on which the calculation should be run.
          • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
          • dry_run, bool, optional, is_metadata – When set to True will prepare the calculation job for submission but not actually launch it.
          • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
          • options, Namespace
            • account, (str, NoneType), optional, is_metadata – Set the account to use for the queue on the remote computer
            • additional_retrieve_list, (list, tuple, NoneType), optional, is_metadata – List of relative file paths that should be retrieved in addition to what the plugin specifies.
            • append_text, str, optional, is_metadata – Set the calculation-specific append text, which is going to be appended in the scheduler-job script, just after the code execution
            • custom_scheduler_commands, str, optional, is_metadata – Set a (possibly multiline) string with the commands that the user wants to manually set for the scheduler. The difference of this option with respect to the prepend_text is the position in the scheduler submission file where such text is inserted: with this option, the string is inserted before any non-scheduler command
            • environment_variables, dict, optional, is_metadata – Set a dictionary of custom environment variables for this calculation
            • environment_variables_double_quotes, bool, optional, is_metadata – If set to True, use double quotes instead of single quotes to escape the environment variables specified in environment_variables.
            • import_sys_environment, bool, optional, is_metadata – If set to true, the submission script will load the system environment variables
            • input_filename, str, optional, is_metadata
            • max_memory_kb, (int, NoneType), optional, is_metadata – Set the maximum memory (in KiloBytes) to be asked to the scheduler
            • max_wallclock_seconds, (int, NoneType), optional, is_metadata – Set the wallclock in seconds asked to the scheduler
            • mpirun_extra_params, (list, tuple), optional, is_metadata – Set the extra params to pass to the mpirun (or equivalent) command after the one provided in computer.mpirun_command. Example: mpirun -np 8 extra_params[0] extra_params[1] … exec.x
            • output_filename, str, optional, is_metadata
            • parser_name, str, optional, is_metadata
            • prepend_text, str, optional, is_metadata – Set the calculation-specific prepend text, which is going to be prepended in the scheduler-job script, just before the code execution
            • priority, (str, NoneType), optional, is_metadata – Set the priority of the job to be queued
            • qos, (str, NoneType), optional, is_metadata – Set the quality of service to use for the queue on the remote computer
            • queue_name, (str, NoneType), optional, is_metadata – Set the name of the queue on the remote computer
            • rerunnable, (bool, NoneType), optional, is_metadata – Determines if the calculation can be requeued / rerun.
            • resources, dict, required, is_metadata – Set the dictionary of resources to be used by the scheduler plugin, like the number of nodes, cpus etc. This dictionary is scheduler-plugin dependent. Look at the documentation of the scheduler for more details.
            • scheduler_stderr, str, optional, is_metadata – Filename to which the content of stderr of the scheduler is written.
            • scheduler_stdout, str, optional, is_metadata – Filename to which the content of stdout of the scheduler is written.
            • stash, Namespace – Optional directives to stash files after the calculation job has completed.
              • source_list, (tuple, list, NoneType), optional, is_metadata – Sequence of relative filepaths representing files in the remote directory that should be stashed.
              • stash_mode, (str, NoneType), optional, is_metadata – Mode with which to perform the stashing, should be value of aiida.common.datastructures.StashMode.
              • target_base, (str, NoneType), optional, is_metadata – The base location to where the files should be stashed. For example, for the copy stash mode, this should be an absolute filepath on the remote computer.
            • submit_script_filename, str, optional, is_metadata – Filename to which the job submission script is written.
            • withmpi, bool, optional, is_metadata
            • without_xml, (bool, NoneType), optional, is_metadata – If set to True the parser will not fail if the XML file is missing in the retrieved folder.
          • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
        • monitors, Namespace – Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job.
        • parallelization, (Dict, NoneType), optional – Parallelization options. The following flags are allowed: npool : The number of ‘pools’, each taking care of a group of k-points. nband : The number of ‘band groups’, each taking care of a group of Kohn-Sham orbitals. ntg : The number of ‘task groups’ across which the FFT planes are distributed. ndiag : The number of ‘linear algebra groups’ used when parallelizing the subspace diagonalization / iterative orthonormalization. By default, no parameter is passed to Quantum ESPRESSO, meaning it will use its default.
        • parameters, Dict, required – The input parameters that are to be used to construct the input file.
        • pseudos, Namespace – A mapping of UpfData nodes onto the kind name to which they should apply.
        • remote_folder, (RemoteData, NoneType), optional – Remote directory containing the results of an already completed calculation job without AiiDA. The inputs should be passed to the CalcJob as normal but instead of launching the actual job, the engine will recreate the input files and then proceed straight to the retrieve step where the files of this RemoteData will be retrieved as if it had been actually launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual.
        • settings, (Dict, NoneType), optional – Optional parameters to affect the way the calculation job and the parsing are performed.
        • vdw_table, (SinglefileData, NoneType), optional – Optional van der Waals table contained in a SinglefileData.
    • settings, Namespace – Options for how to run the workflow.
      • sleep_submission_time, (int, float), optional, non_db – Time in seconds to wait before submitting subsequent displaced structure scf calculations.
  • intensities_average, Namespace – Inputs for the IntensitiesAverageWorkChain that will be used to run the average calculation over intensities.
    • metadata, Namespace
      • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
      • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
      • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
      • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
    • parameters, Dict, optional – Options for averaging on the non-analytical directions.
  • metadata, Namespace
    • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
    • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
    • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
    • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
  • phonon, Namespace – Inputs for the PhononWorkChain that will be used to calculate the force constants.
    • clean_workdir, Bool, optional – If True, work directories of all called calculations will be cleaned at the end of execution.
    • displacement_generator, (Dict, NoneType), optional – Info for displacements generation. The following flags are allowed: distance, is_plusminus, is_diagonal, is_trigonal, number_of_snapshots, random_seed, cutoff_frequency.
    • metadata, Namespace
      • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
      • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
      • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
      • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
    • phonopy, Namespace – Inputs for the PhonopyCalculation that will be used to calculate the inter-atomic force constants, or for post-processing.
      • code, (AbstractCode, NoneType), optional – The Code to use for this job. This input is required, unless the remote_folder input is specified, which means an existing job is being imported and no code will actually be run.
      • metadata, Namespace
        • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
        • computer, (Computer, NoneType), optional, is_metadata – When using a “local” code, set the computer on which the calculation should be run.
        • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
        • dry_run, bool, optional, is_metadata – When set to True will prepare the calculation job for submission but not actually launch it.
        • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
        • options, Namespace
          • account, (str, NoneType), optional, is_metadata – Set the account to use for the queue on the remote computer
          • additional_retrieve_list, (list, tuple, NoneType), optional, is_metadata – List of relative file paths that should be retrieved in addition to what the plugin specifies.
          • append_text, str, optional, is_metadata – Set the calculation-specific append text, which is going to be appended in the scheduler-job script, just after the code execution
          • custom_scheduler_commands, str, optional, is_metadata – Set a (possibly multiline) string with the commands that the user wants to manually set for the scheduler. The difference of this option with respect to the prepend_text is the position in the scheduler submission file where such text is inserted: with this option, the string is inserted before any non-scheduler command
          • environment_variables, dict, optional, is_metadata – Set a dictionary of custom environment variables for this calculation
          • environment_variables_double_quotes, bool, optional, is_metadata – If set to True, use double quotes instead of single quotes to escape the environment variables specified in environment_variables.
          • import_sys_environment, bool, optional, is_metadata – If set to true, the submission script will load the system environment variables
          • input_filename, str, optional, is_metadata
          • max_memory_kb, (int, NoneType), optional, is_metadata – Set the maximum memory (in KiloBytes) to be asked to the scheduler
          • max_wallclock_seconds, (int, NoneType), optional, is_metadata – Set the wallclock in seconds asked to the scheduler
          • mpirun_extra_params, (list, tuple), optional, is_metadata – Set the extra params to pass to the mpirun (or equivalent) command after the one provided in computer.mpirun_command. Example: mpirun -np 8 extra_params[0] extra_params[1] … exec.x
          • output_filename, str, optional, is_metadata
          • parser_name, str, optional, is_metadata
          • prepend_text, str, optional, is_metadata – Set the calculation-specific prepend text, which is going to be prepended in the scheduler-job script, just before the code execution
          • priority, (str, NoneType), optional, is_metadata – Set the priority of the job to be queued
          • qos, (str, NoneType), optional, is_metadata – Set the quality of service to use for the queue on the remote computer
          • queue_name, (str, NoneType), optional, is_metadata – Set the name of the queue on the remote computer
          • rerunnable, (bool, NoneType), optional, is_metadata – Determines if the calculation can be requeued / rerun.
          • resources, dict, required, is_metadata – Set the dictionary of resources to be used by the scheduler plugin, like the number of nodes, cpus etc. This dictionary is scheduler-plugin dependent. Look at the documentation of the scheduler for more details.
          • scheduler_stderr, str, optional, is_metadata – Filename to which the content of stderr of the scheduler is written.
          • scheduler_stdout, str, optional, is_metadata – Filename to which the content of stdout of the scheduler is written.
          • stash, Namespace – Optional directives to stash files after the calculation job has completed.
            • source_list, (tuple, list, NoneType), optional, is_metadata – Sequence of relative filepaths representing files in the remote directory that should be stashed.
            • stash_mode, (str, NoneType), optional, is_metadata – Mode with which to perform the stashing, should be value of aiida.common.datastructures.StashMode.
            • target_base, (str, NoneType), optional, is_metadata – The base location to where the files should be stashed. For example, for the copy stash mode, this should be an absolute filepath on the remote computer.
          • submit_script_filename, str, optional, is_metadata – Filename to which the job submission script is written.
          • withmpi, bool, optional, is_metadata
        • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
      • monitors, Namespace – Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job.
      • parameters, Dict, required – Phonopy parameters (setting tags) for post processing. The following tags, along their type, are allowed: PRIMITIVE_AXES PRIMITIVE_AXIS EIGENVECTORS BAND BAND_PATHS BAND_POINTS BAND_LABELS BAND_CONNECTION BAND_INDICES MESH MP MESH_NUMBERS MP_SHIFT GAMMA_CENTER WRITE_MESH DOS DOS_RANGE FMIN FMAX FPITCH PDOS PROJECTION_DIRECTION XYZ_DIRECTION SIGMA DEBYE_MODEL MOMEMT MOMENT_ORDER TPROP TMIN TMAX TSTEP PRETEND_REAL CUTOFF_FREQUENCY TDISP TDISPMAT TDISPMAT_CIF QPOINTS WRITEDM NAC_METHOD Q_DIRECTION GROUP_VELOCITY GV_DELTA_Q SYMMETRY_TOLERANCE SYMMETRY MESH_SYMMETRY FC_SYMMETRY FULL_FORCE_CONSTANTS WRITE_FORCE_CONSTANTS ANIME_TYPE ANIME MODULATION IRREPS SHOW_IRREPS LITTLE_COGROUP
      • remote_folder, (RemoteData, NoneType), optional – Remote directory containing the results of an already completed calculation job without AiiDA. The inputs should be passed to the CalcJob as normal but instead of launching the actual job, the engine will recreate the input files and then proceed straight to the retrieve step where the files of this RemoteData will be retrieved as if it had been actually launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual.
      • settings, (Dict, NoneType), optional – Settings for phonopy calculation.
    • primitive_matrix, (List, NoneType), optional – Primitive matrix that defines the primitive cell from the unitcell.
    • scf, Namespace – Inputs for the PwBaseWorkChain that will be used to run the scf calculations on the displaced structures.
      • handler_overrides, (Dict, NoneType), optional – Mapping where keys are process handler names and the values are a dictionary, where each dictionary can define the enabled and priority key, which can be used to toggle the values set on the original process handler declaration.
      • kpoints, (KpointsData, NoneType), optional – An explicit k-points list or mesh. Either this or kpoints_distance has to be provided.
      • kpoints_distance, (Float, NoneType), optional – The minimum desired distance in 1/Å between k-points in reciprocal space. The explicit k-points will be generated automatically by a calculation function based on the input structure.
      • kpoints_force_parity, (Bool, NoneType), optional – Optional input when constructing the k-points based on a desired kpoints_distance. Setting this to True will force the k-point mesh to have an even number of points along each lattice vector except for any non-periodic directions.
      • max_iterations, Int, optional – Maximum number of iterations the work chain will restart the process to finish successfully.
      • metadata, Namespace
        • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
        • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
        • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
        • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
      • pw, Namespace
        • code, (AbstractCode, NoneType), optional – The Code to use for this job. This input is required, unless the remote_folder input is specified, which means an existing job is being imported and no code will actually be run.
        • hubbard_file, (SinglefileData, NoneType), optional – SinglefileData node containing the output Hubbard parameters from a HpCalculation
        • metadata, Namespace
          • call_link_label, str, optional, is_metadata – The label to use for the CALL link if the process is called by another process.
          • computer, (Computer, NoneType), optional, is_metadata – When using a “local” code, set the computer on which the calculation should be run.
          • description, (str, NoneType), optional, is_metadata – Description to set on the process node.
          • dry_run, bool, optional, is_metadata – When set to True will prepare the calculation job for submission but not actually launch it.
          • label, (str, NoneType), optional, is_metadata – Label to set on the process node.
          • options, Namespace
            • account, (str, NoneType), optional, is_metadata – Set the account to use for the queue on the remote computer
            • additional_retrieve_list, (list, tuple, NoneType), optional, is_metadata – List of relative file paths that should be retrieved in addition to what the plugin specifies.
            • append_text, str, optional, is_metadata – Set the calculation-specific append text, which is going to be appended in the scheduler-job script, just after the code execution
            • custom_scheduler_commands, str, optional, is_metadata – Set a (possibly multiline) string with the commands that the user wants to manually set for the scheduler. The difference of this option with respect to the prepend_text is the position in the scheduler submission file where such text is inserted: with this option, the string is inserted before any non-scheduler command
            • environment_variables, dict, optional, is_metadata – Set a dictionary of custom environment variables for this calculation
            • environment_variables_double_quotes, bool, optional, is_metadata – If set to True, use double quotes instead of single quotes to escape the environment variables specified in environment_variables.
            • import_sys_environment, bool, optional, is_metadata – If set to true, the submission script will load the system environment variables
            • input_filename, str, optional, is_metadata
            • max_memory_kb, (int, NoneType), optional, is_metadata – Set the maximum memory (in KiloBytes) to be asked to the scheduler
            • max_wallclock_seconds, (int, NoneType), optional, is_metadata – Set the wallclock in seconds asked to the scheduler
            • mpirun_extra_params, (list, tuple), optional, is_metadata – Set the extra params to pass to the mpirun (or equivalent) command after the one provided in computer.mpirun_command. Example: mpirun -np 8 extra_params[0] extra_params[1] … exec.x
            • output_filename, str, optional, is_metadata
            • parser_name, str, optional, is_metadata
            • prepend_text, str, optional, is_metadata – Set the calculation-specific prepend text, which is going to be prepended in the scheduler-job script, just before the code execution
            • priority, (str, NoneType), optional, is_metadata – Set the priority of the job to be queued
            • qos, (str, NoneType), optional, is_metadata – Set the quality of service to use for the queue on the remote computer
            • queue_name, (str, NoneType), optional, is_metadata – Set the name of the queue on the remote computer
            • rerunnable, (bool, NoneType), optional, is_metadata – Determines if the calculation can be requeued / rerun.
            • resources, dict, required, is_metadata – Set the dictionary of resources to be used by the scheduler plugin, like the number of nodes, cpus etc. This dictionary is scheduler-plugin dependent. Look at the documentation of the scheduler for more details.
            • scheduler_stderr, str, optional, is_metadata – Filename to which the content of stderr of the scheduler is written.
            • scheduler_stdout, str, optional, is_metadata – Filename to which the content of stdout of the scheduler is written.
            • stash, Namespace – Optional directives to stash files after the calculation job has completed.
              • source_list, (tuple, list, NoneType), optional, is_metadata – Sequence of relative filepaths representing files in the remote directory that should be stashed.
              • stash_mode, (str, NoneType), optional, is_metadata – Mode with which to perform the stashing, should be value of aiida.common.datastructures.StashMode.
              • target_base, (str, NoneType), optional, is_metadata – The base location to where the files should be stashed. For example, for the copy stash mode, this should be an absolute filepath on the remote computer.
            • submit_script_filename, str, optional, is_metadata – Filename to which the job submission script is written.
            • withmpi, bool, optional, is_metadata
            • without_xml, (bool, NoneType), optional, is_metadata – If set to True the parser will not fail if the XML file is missing in the retrieved folder.
          • store_provenance, bool, optional, is_metadata – If set to False provenance will not be stored in the database.
        • monitors, Namespace – Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job.
        • parallelization, (Dict, NoneType), optional – Parallelization options. The following flags are allowed: npool : The number of ‘pools’, each taking care of a group of k-points. nband : The number of ‘band groups’, each taking care of a group of Kohn-Sham orbitals. ntg : The number of ‘task groups’ across which the FFT planes are distributed. ndiag : The number of ‘linear algebra groups’ used when parallelizing the subspace diagonalization / iterative orthonormalization. By default, no parameter is passed to Quantum ESPRESSO, meaning it will use its default.
        • parameters, Dict, required – The input parameters that are to be used to construct the input file.
        • pseudos, Namespace – A mapping of UpfData nodes onto the kind name to which they should apply.
        • remote_folder, (RemoteData, NoneType), optional – Remote directory containing the results of an already completed calculation job without AiiDA. The inputs should be passed to the CalcJob as normal but instead of launching the actual job, the engine will recreate the input files and then proceed straight to the retrieve step where the files of this RemoteData will be retrieved as if it had been actually launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual.
        • settings, (Dict, NoneType), optional – Optional parameters to affect the way the calculation job and the parsing are performed.
        • vdw_table, (SinglefileData, NoneType), optional – Optional van der Waals table contained in a SinglefileData.
    • settings, Namespace – Options for how to run the workflow.
      • sleep_submission_time, (int, float), optional, non_db – Time in seconds to wait before submitting subsequent displaced structure scf calculations.
  • settings, Namespace – Options for how to run the workflow.
    • run_parallel, bool, optional, non_db – Whether to run the DielectricWorkChain and the PhononWorkChain in parallel.
    • use_primitive_cell, Bool, optional – Whether to use the primitive cell for the DielectricWorkChain. WARNING: it is not implemented for HubbardStructureData.
  • structure, StructureData, required
  • symmetry, Namespace – Namespace for symmetry related inputs.
    • distinguish_kinds, Bool, optional – Whether or not to distinguish atoms of the same species but with different kind names when analysing the symmetries.
    • is_symmetry, Bool, optional – Whether or not to use the space group symmetries.
    • symprec, Float, optional – Symmetry tolerance for space group analysis on the input structure.
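The nested namespaces above map directly onto the process builder. The following sketch shows how the main ports might be populated for a Raman calculation; the code label, structure PK, pseudopotential family, cutoffs, k-point distances and scheduler resources are placeholders to adapt to your own setup (and it assumes an aiida-pseudo family is installed).

    from aiida import load_profile, orm
    from aiida.engine import submit
    from aiida_vibroscopy.workflows.spectra.iraman import IRamanSpectraWorkChain

    load_profile()

    code = orm.load_code('pw@localhost')      # placeholder code label
    structure = orm.load_node(1234)           # placeholder StructureData PK
    family = orm.load_group('SSSP/1.3/PBEsol/efficiency')  # placeholder pseudo family
    pseudos = family.get_pseudos(structure=structure)

    scf_parameters = orm.Dict({               # placeholder cutoffs
        'CONTROL': {'calculation': 'scf'},
        'SYSTEM': {'ecutwfc': 70.0, 'ecutrho': 560.0},
    })
    options = {'resources': {'num_machines': 1}, 'max_wallclock_seconds': 3600}

    builder = IRamanSpectraWorkChain.get_builder()
    builder.structure = structure

    # Finite-field (DielectricWorkChain) namespace.
    builder.dielectric.property = 'raman'     # one of the valid `property` values listed above
    builder.dielectric.scf.pw.code = code
    builder.dielectric.scf.pw.parameters = scf_parameters
    builder.dielectric.scf.pw.pseudos = pseudos
    builder.dielectric.scf.pw.metadata.options = options
    builder.dielectric.scf.kpoints_distance = orm.Float(0.15)

    # Finite-displacement (PhononWorkChain) namespace.
    builder.phonon.scf.pw.code = code
    builder.phonon.scf.pw.parameters = scf_parameters
    builder.phonon.scf.pw.pseudos = pseudos
    builder.phonon.scf.pw.metadata.options = options
    builder.phonon.scf.kpoints_distance = orm.Float(0.15)

    # Workflow-level settings.
    builder.settings.run_parallel = True      # run the two sub-workchains in parallel
    builder.symmetry.symprec = orm.Float(1e-5)

    node = submit(builder)
    print(f'Submitted IRamanSpectraWorkChain<{node.pk}>')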

Outputs:

  • fake, Namespace
    • ir_averaged, ArrayData, required – Contains the averaged IR intensities.
    • raman_averaged, ArrayData, optional – Contains the averaged Raman intensities.
    • units, Dict, optional – Units of intensities and frequencies.
  • output_dielectric, Namespace – Outputs of the DielectricWorkChain.
    • accuracy_order, Int, optional
    • critical_electric_field, Float, optional
    • electric_field_step, Float, optional
    • fields_data, Namespace – Namespace for passing TrajectoryData containing forces and polarization.
      • field_index_0, Namespace
      • field_index_1, Namespace
      • field_index_2, Namespace
      • field_index_3, Namespace
      • field_index_4, Namespace
      • field_index_5, Namespace
    • tensors, Namespace – Contains the high-frequency dielectric and Born effective charges tensors computed in Cartesian coordinates. Depending on the inputs, it can also contain the derivatives of the susceptibility with respect to the atomic positions (called Raman tensors) and the non-linear optical susceptibility, always expressed in Cartesian coordinates.
    • units, Dict, optional – Units of the susceptibility derivatives tensors.
  • output_intensities_average, Namespace – Intensities average over space and q-points.
  • output_phonon, Namespace – Outputs of the PhononWorkChain.
    • output_phonopy, Namespace
      • irreducible_representations, Dict, optional – Irreducible representation output.
      • modulation, Dict, optional – Modulation information.
      • output_force_constants, ArrayData, optional – Calculated force constants.
      • output_parameters, Dict, optional – Sum up info of phonopy calculation.
      • phonon_bands, BandsData, optional – Calculated phonon band structure.
      • projected_phonon_dos, XyData, optional – Calculated projected DOS.
      • qpoints, BandsData, optional – Calculated qpoints.
      • qpoints_mesh, BandsData, optional – Calculated qpoint mesh.
      • remote_folder, RemoteData, required – Input files necessary to run the process will be stored in this folder node.
      • remote_stash, RemoteStashData, optional – Contents of the stash.source_list option are stored in this remote folder after job completion.
      • retrieved, FolderData, required – Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in CalcInfo.retrieve_list.
      • thermal_displacement_matrices, Dict, optional – Calculated thermal displacements matrices.
      • thermal_displacements, Dict, optional – Calculated thermal displacements.
      • thermal_properties, XyData, optional – Calculated thermal properties.
      • total_phonon_dos, XyData, optional – Calculated total DOS.
    • phonopy_data, PhonopyData, required – The phonopy data with supercell displacements and forces to use in the post-processing calculation.
    • supercells, Namespace – The supercells with displacements.
    • supercells_forces, Namespace – The forces acting on the atoms of each supercell.
  • vibrational_data, Namespace – The phonopy data with supercell displacements, forces and (optionally) NAC parameters to use in the post-processing calculation.
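Once a run has finished, the outputs listed above are reachable through the process node's `outputs` attribute. A small sketch with a placeholder node PK; which optional ports are actually present depends on the chosen inputs.

    from aiida import load_profile, orm

    load_profile()
    node = orm.load_node(5678)   # a terminated IRamanSpectraWorkChain (placeholder PK)

    # Results of the sub-workchains.
    print(node.outputs.output_dielectric.units.get_dict())   # units of the tensors (if present)
    phonopy_data = node.outputs.output_phonon.phonopy_data   # PhonopyData for post-processing

    # The vibrational data (displacements, forces and, optionally, NAC parameters) and the
    # averaged intensities live in the dynamic namespaces `vibrational_data` and
    # `output_intensities_average`.
    vibrational_data = node.outputs.vibrational_data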

Outline:

run_spectra(Run a `HarmonicWorkChain` at Gamma.)
inspect_process(Inspect that the `HarmonicWorkChain` finished successfully.)
if(should_run_average)
    run_intensities_averaged(Run an `IntensitiesAverageWorkChain` with the calculated vibrational data.)
    inspect_averaging(Inspect and expose the outputs.)
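
For orientation, this is how an outline with the structure above is typically declared in an AiiDA work chain, using the `if_` construct from `aiida.engine`. It is an illustrative skeleton that mirrors the listed steps, not the actual implementation.

    from aiida.engine import WorkChain, if_


    class SpectraLikeWorkChain(WorkChain):
        """Illustrative skeleton mirroring the outline above (not the real source)."""

        @classmethod
        def define(cls, spec):
            super().define(spec)
            spec.outline(
                cls.run_spectra,       # run a HarmonicWorkChain at Gamma
                cls.inspect_process,   # check that it finished successfully
                if_(cls.should_run_average)(
                    cls.run_intensities_averaged,  # average the intensities
                    cls.inspect_averaging,         # inspect and expose the outputs
                ),
            )

        def run_spectra(self):
            """Run a HarmonicWorkChain at Gamma (stub)."""

        def inspect_process(self):
            """Inspect that the HarmonicWorkChain finished successfully (stub)."""

        def should_run_average(self):
            """Decide whether the averaging step should run (stub)."""
            return True

        def run_intensities_averaged(self):
            """Run an IntensitiesAverageWorkChain with the calculated vibrational data (stub)."""

        def inspect_averaging(self):
            """Inspect and expose the outputs (stub)."""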