Sparkle
A Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning.
Specifically, Sparkle facilitates the use of:
Automated algorithm configuration
Automated algorithm selection
Furthermore, Sparkle handles various tasks for the user such as:
Algorithm meta information collection and statistics calculation
Instance/Data Set management and feature extraction
Compute cluster job submission and monitoring
Log file collection
Installation
Sparkle can be installed quickly and fully using Conda (see the Conda documentation for how to install Conda itself).
Simply download the environment.yml file from the GitHub repository with wget:
wget https://raw.githubusercontent.com/ADA-research/Sparkle/main/environment.yml
and run:
conda env create -f environment.yml
The installation of the environment may take up to five minutes depending on your internet connection. Once the environment has been created it can be activated by:
conda activate sparkle
Note
The creation of the Conda environment also takes care of the installation of the Sparkle package itself.
Note
You will need to reactivate the environment every time you start the terminal, before using Sparkle.
Sparkle can also be installed as a standalone package using Pip. We recommend creating a new virtual environment (for example, with venv) beforehand to avoid clashes between dependencies.
pip install SparkleAI
Note that a direct installation through Pip does not handle certain dependencies of the Sparkle CLI, such as the required libraries for compiling RunSolver.
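For example, a minimal setup with venv might look as follows (the environment name and Python invocation are illustrative and may differ on your system):
python3 -m venv sparkle-venv        # create a fresh virtual environment
source sparkle-venv/bin/activate    # activate it (on Windows: sparkle-venv\Scripts\activate)
pip install SparkleAI               # install Sparkle into the isolated environment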
Install dependencies
Aside from several package dependencies, Sparkle's package and CLI rely on a few user-supplied executables:
A LaTeX compiler (pdflatex) for report generation
Java, tested with version 1.8.0_402, in order to use SMAC2
R 4.3.1, in order to use IRACE
Other dependencies are handled by the Conda environment, but if that is not an option for you, please ensure you have the following (a quick availability check is sketched below):
libnuma and numactl for RunSolver compilation, which Sparkle uses to measure solver run data. This is restricted to Linux-based systems.
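To verify that these executables are available, a quick check along these lines may help (exact output and availability vary per system; the versions printed should match the requirements above):
pdflatex --version    # LaTeX compiler for report generation
java -version         # Java runtime for SMAC2
R --version           # R for IRACE
numactl --show        # numactl / libnuma, needed to compile RunSolver (Linux only)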
For detailed installation instructions see the documentation: https://ada-research.github.io/Sparkle/
Developer installation
The file dev-env.yml is used for developer mode of the Sparkle package and contains several extra packages for testing. The two environments can be created in parallel, since one is named sparkle and the other sparkle-dev. If you want to update an environment, it is better to do a clean installation by removing and recreating it. For example:
conda deactivate
conda env remove -n sparkle
conda env create -f environment.yml
conda activate sparkle
This should be fast, as both conda and pip use a local cache for the packages.
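Creating the developer environment works the same way as the regular one; a minimal sketch, assuming dev-env.yml names the environment sparkle-dev:
conda env create -f dev-env.yml
conda activate sparkle-dev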
Examples
See the Examples directory for some examples on how to use Sparkle. All Sparkle CLI commands need to be executed from the root of the initialised Sparkle directory.
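A rough sketch of that workflow is shown below; the command names are illustrative placeholders, so consult the documentation and the scripts in Examples for the exact CLI commands and arguments:
cd path/to/sparkle-working-directory   # run everything from the root of the initialised Sparkle directory
sparkle initialise                     # placeholder command name: set up the platform in this directory
# add solvers/instances and run configuration or selection as demonstrated in the Examples scripts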
Documentation
The documentation can be read at https://ada-research.github.io/Sparkle/.
A PDF version is also available in the repository.
Licensing
Sparkle is distributed under the MIT licence.
Component licences
Sparkle is distributed with a number of external components, solvers, and instance sets. Descriptions and licensing information for each of these are included in the sparkle/Components and Examples/Resources/ directories.
The SATzilla 2012 feature extractor is taken from http://www.cs.ubc.ca/labs/beta/Projects/SATzilla/ with some modifications. The main modification disables the call to the SAT instance preprocessor SatELite. It is located in: Examples/Resources/Extractors/SAT-features-competition2012_revised_without_SatELite_sparkle/
Citation
If you use Sparkle for one of your papers and want to cite it, please cite our paper describing Sparkle: K. van der Blom, H. H. Hoos, C. Luo and J. G. Rook, Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems, in IEEE Transactions on Evolutionary Computation, vol. 26, no. 6, pp. 1351-1364, Dec. 2022, doi: 10.1109/TEVC.2022.3215013.
@article{BloEtAl22,
title={Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems},
author={van der Blom, Koen and Hoos, Holger H. and Luo, Chuan and Rook, Jeroen G.},
journal={IEEE Transactions on Evolutionary Computation},
year={2022},
volume={26},
number={6},
pages={1351--1364},
doi={10.1109/TEVC.2022.3215013}
}
Maintainers
Thijs Snelleman, Jeroen Rook, Holger H. Hoos
Contributors
Chuan Luo, Richard Middelkoop, Jérémie Gobeil, Sam Vermeulen, Marcel Baumann, Jakob Bossek, Tarek Junied, Yingliu Lu, Malte Schwerin, Aaron Berger, Marie Anastacio, Koen van der Blom, Noah Peil, Brian Schiller
Contact
sparkle@aim.rwth-aachen.de
Sponsors
The development of Sparkle is partially sponsored by the Alexander von Humboldt Foundation.