JuMP is a modeling interface and a collection of supporting packages for mathematical optimization that is embedded in Julia. With JuMP, users formulate various classes of optimization problems with easy-to-read code, and then solve these problems using state-of-the-art open-source and commercial solvers. JuMP also makes advanced optimization techniques easily accessible from a high-level language.
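As a taste of the syntax, here is a small linear program solved with the open-source HiGHS solver (the solver choice here is illustrative; any supported solver works):

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@variable(model, 0 <= y <= 3)
@objective(model, Min, 12x + 20y)
@constraint(model, c1, 6x + 8y >= 100)
@constraint(model, c2, 7x + 12y >= 120)
optimize!(model)
println("Objective value: ", objective_value(model))
```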
JuMP will be participating as a sub-organization in NumFOCUS's application for Google Summer of Code 2026.
For more information about this application, see NumFOCUS's main GSoC page.
All GSoC-related issues and pull requests must disclose AI usage.
If AI tools were used, end the issue or PR with a sentence beginning:
AI assistance was used for ...
If no AI tools were used at all, end with the sentence:
No AI tools were used in this contribution.
Submissions that omit a disclosure may be closed without review.
MathOptAI.jl is a package for embedding trained machine learning predictors into JuMP models. The field is moving fast, and many new models and formulations are being proposed. The goal of this project is to add support for new predictors to MathOptAI.jl so that it remains state-of-the-art.
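For context, the basic workflow looks something like the following sketch, which embeds a simple affine predictor (this assumes the `add_predictor` API from the MathOptAI.jl documentation; exact return values may vary between versions):

```julia
using JuMP, MathOptAI

model = Model()
@variable(model, x[1:2])
# Embed a simple affine predictor y = A * x + b. MathOptAI also supports
# predictors from frameworks such as Flux and PyTorch via package extensions.
predictor = MathOptAI.Affine([2.0 3.0], [0.1])
y, formulation = MathOptAI.add_predictor(model, predictor, x)
```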
The contributor will conduct a literature review of the field to construct a prioritized list of new formulations that are of interest to the mathematical optimization community. Then, as time permits, they will add new predictors to MathOptAI.jl.
For each new predictor added to MathOptAI.jl, the contributor must add:
- tests that achieve 100% code coverage (see the sketch after this list)
- integration with relevant package extensions
- documentation, including new tutorials that use the predictor
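As a rough illustration of the expected testing style, using the existing ReLU predictor as a stand-in for a new one:

```julia
using JuMP, MathOptAI, Test

function test_relu_predictor()
    model = Model()
    @variable(model, x[1:2])
    # ReLU is an existing predictor; a new predictor would be tested in the
    # same style. (Assumption: add_predictor returns (y, formulation).)
    y, _ = MathOptAI.add_predictor(model, MathOptAI.ReLU(), x)
    @test length(y) == 2
    @test num_constraints(model; count_variable_in_set_constraints = true) > 0
    return
end

test_relu_predictor()
```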
There are two strong prerequisites:
- Strong knowledge of mathematical optimization (an ideal candidate will be enrolled in a Ph.D. course in optimization)
- Strong knowledge of Julia and JuMP (an ideal candidate will have prior experience contributing to open-source Julia repositories that depend on JuMP or MathOptInterface)
and two weak prerequisites:
- Basic knowledge of Python
- Basic knowledge of machine learning frameworks such as PyTorch
Candidates looking to apply for this project are strongly discouraged from opening issues in MathOptAI.jl, and pull requests will be closed without review. Instead, we prefer that candidates include in their application a portfolio of work on other JuMP- or Julia-based projects.
- 175 hours (Medium): at least four (4) new predictors
- 350 hours (Large): at least eight (8) new predictors
JuMPConverter.jl is a converter from AMPL™ models in .mod format and GAMS™ models in .gms format to JuMP models.
The goal of this project is to extend the AMPL converter to support a broader range of AMPL
language features, enabling conversion of well-known benchmark collections that are widely
used in the optimization community.
The contributor will extend the AMPL converter in JuMPConverter.jl to handle the AMPL
constructs required by benchmark collections. This includes support for AMPL .mod files
with associated .dat data files, complementarity constraints (as used in MPECs), and
the nonlinear expressions commonly found in standard test problems. The work will be
validated against real-world benchmark sets with known solutions.
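For example, AMPL's complementarity syntax has a direct JuMP analogue, so an MPEC constraint might translate along these lines (a minimal hand-written sketch, not actual converter output):

```julia
# AMPL:  subject to compl: 0 <= x complements 2*x - 1 >= 0;
using JuMP

model = Model()
@variable(model, x >= 0)
@constraint(model, complements(2x - 1, x))
```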
Given AMPL models as input, the converter generates equivalent Julia code using the JuMP framework. Algebraic expressions are delegated to Julia's native parser, simplifying the translation process and improving reliability.
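To illustrate, a toy AMPL model might map to JuMP code along these lines (a hand-written sketch; the exact code emitted by JuMPConverter.jl may differ):

```julia
# AMPL input (example.mod):
#   var x >= 0;
#   var y >= 0;
#   minimize cost: 2*x + 3*y;
#   subject to demand: x + y >= 10;

# Generated JuMP code:
using JuMP

model = Model()
@variable(model, x >= 0)
@variable(model, y >= 0)
@objective(model, Min, 2x + 3y)
@constraint(model, demand, x + y >= 10)
```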
Note: directly reading an AMPL model into a MathOptInterface (MOI) model (thereby implementing the MathOptInterface.FileFormats API) is out of scope and is not a requirement for completing the proposal; see JuMPConverter.jl Issue #4.
For the AMPL converter improvements, the contributor must add:
- tests that achieve high code coverage for new AMPL constructs
- support for reading .mod files with .dat data files (see the sketch after this list)
- documentation of supported and unsupported AMPL features
- integration tests using models from MacMPEC.jl and CUTE.jl
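As an illustration of the .mod/.dat pairing, a parameterized model and its data file might map to JuMP as follows (a hand-written sketch under the assumption that the converter inlines data values; its actual output may differ):

```julia
# AMPL model (example.mod):
#   set PLANTS;
#   param capacity {PLANTS};
#   var ship {PLANTS} >= 0;
#   subject to limit {p in PLANTS}: ship[p] <= capacity[p];
#
# AMPL data (example.dat):
#   set PLANTS := A B;
#   param capacity := A 100 B 80;

# Equivalent JuMP code with the data folded in:
using JuMP

PLANTS = ["A", "B"]
capacity = Dict("A" => 100, "B" => 80)
model = Model()
@variable(model, ship[p in PLANTS] >= 0)
@constraint(model, limit[p in PLANTS], ship[p] <= capacity[p])
```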
The following skills would be valuable for the project:
- Good knowledge of mathematical optimization (MPECs, nonlinear programming)
- Good knowledge of Julia and JuMP
- Basic knowledge of parsing techniques
- Familiarity with AMPL syntax and modeling
- 175 hours (Medium): extend the AMPL converter to successfully convert all models in MacMPEC.jl (the MacMPEC collection of Mathematical Programs with Equilibrium Constraints in AMPL format)
- 350 hours (Large): in addition to MacMPEC, successfully convert all models in CUTE.jl (the CUTE constrained and unconstrained testing environment benchmark set)