kalomaze / modeling_mixtral.py
Created May 5, 2024 03:38
Fixed Mixtral training code for HF Transformers
# coding=utf-8
# Copyright 2023 Mistral AI and the HuggingFace Inc. team. All rights reserved.
#
# This code is based on EleutherAI's GPT-NeoX library and the GPT-NeoX
# and OPT implementations in this library. It has been modified from its
# original forms to accommodate minor architectural differences compared
# to GPT-NeoX and OPT used by the Meta AI team that trained the model.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.