mistralai / mamba-codestral-7b-v0.1

Model Overview

Description:

Codestral Mamba is trained on a diverse dataset of 80+ programming languages, including the most popular ones, such as Python, Java, C, C++, JavaScript, and Bash. It also performs well on more specific ones like Swift and Fortran. This broad language base ensures Codestral Mamba can assist developers in various coding environments and projects.

This model is ready for commercial use and for testing purposes.

Third-Party Community Consideration:

This model is not owned or developed by NVIDIA. This model has been developed and built to a third-party's requirements for this application and use case; see the Mamba-Codestral-7B-v0.1 Hugging Face model card.

Terms of Use

By using this software or model, you agree to the terms and conditions of the license, the acceptable use policy, and Mistral's privacy policy. Mamba-Codestral-7B-v0.1 is released under the Apache 2.0 license.

Reference(s):

Codestral Mamba blog post

Model Architecture:

Architecture Type: Mamba (Selective State Space Model)

Network Architecture: Mamba Codestral 7B v0.1

Model Version: 0.1

Input:

Input Format: Text

Input Parameters: Max Tokens, Temperature, Top P

Max Input Tokens: 4096

Output:

Output Format: Text

Output Parameters: None

Max Output Tokens: 4096
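
The input parameters listed above (Max Tokens, Temperature, Top P) map onto a typical sampling request. As an illustration only, here is a minimal Python sketch that assembles such a request payload and checks it against the 4096-token limits from this card. The payload field names (`max_tokens`, `temperature`, `top_p`) follow common OpenAI-style completion APIs and are an assumption, not a documented interface for this model.

```python
# Illustrative sketch only: the payload field names below are assumptions
# modeled on common OpenAI-style completion APIs, not a documented interface.

MAX_TOKENS_LIMIT = 4096  # per this card: max input and max output tokens


def build_request(prompt: str,
                  max_tokens: int = 512,
                  temperature: float = 0.2,
                  top_p: float = 0.95) -> dict:
    """Assemble a sampling-request payload using the parameters the card lists."""
    if not 0 < max_tokens <= MAX_TOKENS_LIMIT:
        raise ValueError(f"max_tokens must be in (0, {MAX_TOKENS_LIMIT}]")
    if temperature < 0.0:
        raise ValueError("temperature must be non-negative")
    if not 0.0 < top_p <= 1.0:
        raise ValueError("top_p must be in (0, 1]")
    return {
        "model": "mistralai/mamba-codestral-7b-v0.1",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
    }


payload = build_request("Write a Python function that reverses a string.")
print(payload["max_tokens"])
```

Lower temperature and moderate top-p values are common defaults for code generation, where determinism is usually preferred over diversity; both are sketch choices here, not recommendations from this card.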

Software Integration:

Supported Hardware Platform(s): NVIDIA Ampere, NVIDIA Hopper, NVIDIA Turing

Supported Operating System(s): Linux

Inference:

Engine: TRT-LLM

Test Hardware: L40S

Ethical Considerations

NVIDIA believes Trustworthy AI is a shared responsibility and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their internal model team to ensure this model meets requirements for the relevant industry and use case and addresses unforeseen product misuse. Please report security vulnerabilities or NVIDIA AI Concerns here.