starcraft2
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state-of-the-art reinforcement learning algorithms.
An artificial intelligence platform for StarCraft II with large-scale distributed training and grandmaster-level agents.
Decentralized, self-hosted cloud gaming and applications.
XuanCe: A Comprehensive and Unified Deep Reinforcement Learning Library
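As a rough sketch of how a unified library like XuanCe is typically driven, the quickstart below follows the project's documented runner pattern; the exact method and environment names are illustrative assumptions, not a definitive usage.

```python
import xuance

# Illustrative quickstart (assumed API): pick an algorithm, an environment
# family, and a concrete task id, then let the runner handle training.
runner = xuance.get_runner(method="dqn",
                           env="classic_control",
                           env_id="CartPole-v1",
                           is_test=False)
runner.run()  # trains with the library's default hyperparameters for that method
```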
A StarCraft II bot API client library for Python 3.
Reaver: a modular deep reinforcement learning framework focused on StarCraft II, with support for Gym, Atari, and MuJoCo.
(JAIR'2022) A mini-scale reproduction code of the AlphaStar program. Note: the original AlphaStar is the AI proposed by DeepMind to play StarCraft II. JAIR = Journal of Artificial Intelligence Research.
We extend pymarl2 to pymarl3, equipping the MARL algorithms with permutation invariance and permutation equivariance properties. The enhanced algorithm achieves 100% win rates on SMAC-V1 and superior performance on SMAC-V2.
Extracts gameplay information from StarCraft II replay files.
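Assuming this entry refers to the sc2reader package, a minimal sketch of pulling map and player information out of a replay might look like the following; the file name is a placeholder.

```python
import sc2reader

# load_level controls how much of the replay is parsed (higher = more detail, slower).
replay = sc2reader.load_replay("example.SC2Replay", load_level=2)

print(replay.map_name, replay.game_length)
for player in replay.players:
    print(player.name, player.play_race, player.result)
```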
StarCraft II machine learning research with DeepMind's PySC2 Python library: mini-games and agents.
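For orientation, a minimal PySC2 scripted agent on one of the mini-games usually looks like the sketch below; the map name and interface sizes are assumptions, and the structure follows DeepMind's published examples.

```python
from absl import app
from pysc2.agents import base_agent
from pysc2.env import run_loop, sc2_env
from pysc2.lib import actions, features

class NoOpAgent(base_agent.BaseAgent):
    """Agent that does nothing every step; a skeleton to build on."""
    def step(self, obs):
        super().step(obs)
        return actions.FUNCTIONS.no_op()

def main(unused_argv):
    with sc2_env.SC2Env(
            map_name="MoveToBeacon",  # assumed mini-game map
            players=[sc2_env.Agent(sc2_env.Race.terran)],
            agent_interface_format=features.AgentInterfaceFormat(
                feature_dimensions=features.Dimensions(screen=84, minimap=64)),
            step_mul=8) as env:
        run_loop.run_loop([NoOpAgent()], env, max_episodes=1)

if __name__ == "__main__":
    app.run(main)
```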
LLM-PySC2 is the NKAI Decision Team and NUDT Decision Team's Python component of the StarCraft II LLM Decision Environment. It exposes DeepMind's PySC2 Learning Environment API as a Python LLM environment.
Python library for reading MPQ archives.
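Assuming this is the mpyq package, listing and reading the files packed inside an SC2 replay (which is an MPQ archive) takes only a few lines; the replay path is a placeholder.

```python
from mpyq import MPQArchive

archive = MPQArchive("example.SC2Replay")
print(archive.files)                            # names of files stored in the archive
details = archive.read_file("replay.details")   # raw bytes of one archived file
print(len(details), "bytes")
```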
Implementations of reinforcement learning algorithms for the PySC2 environment.
Python framework for rapid development of StarCraft II AI bots.
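A minimal bot skeleton in this style of API generally follows the pattern below; import paths and the way orders are issued differ slightly between python-sc2 forks and versions, so treat the exact module layout as an assumption.

```python
import sc2
from sc2 import run_game, maps, Race, Difficulty
from sc2.player import Bot, Computer

class WorkerRushBot(sc2.BotAI):
    async def on_step(self, iteration):
        # On the first game step, send every worker at the enemy start location.
        if iteration == 0:
            for worker in self.workers:
                await self.do(worker.attack(self.enemy_start_locations[0]))

run_game(
    maps.get("Abyssal Reef LE"),  # assumed map name; use any installed ladder map
    [Bot(Race.Zerg, WorkerRushBot()), Computer(Race.Terran, Difficulty.Medium)],
    realtime=False,
)
```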
StarCraft II Multi-Agent Challenge: QMIX, COMA, LIIR, QTRAN, Central-V, ROMA, RODE, DOP, Graph-MIX.
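For reference, a random-action rollout in the SMAC environment these algorithms are evaluated on looks roughly like this; the map name is an assumption, and the environment interface follows the SMAC README.

```python
import numpy as np
from smac.env import StarCraft2Env

env = StarCraft2Env(map_name="3m")          # assumed micromanagement map
n_agents = env.get_env_info()["n_agents"]

env.reset()
terminated, episode_reward = False, 0.0
while not terminated:
    # Each agent picks a random action among those currently available to it.
    actions = [
        int(np.random.choice(np.nonzero(env.get_avail_agent_actions(a))[0]))
        for a in range(n_agents)
    ]
    reward, terminated, _ = env.step(actions)
    episode_reward += reward
print("episode reward:", episode_reward)
env.close()
```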
Go client library for Blizzard API data
RTS-style building placement in Unity.
StarCraft II client C++ library, a proud fork of Blizzard/s2client-api.