
MyanmarGPT-Big: Breaking Grounds in Language Processing - How to Generate Burmese Text

by Min Si Thu, January 16th, 2024

Too Long; Didn't Read

MyanmarGPT is the first and largest usable Burmese Language Generative Pretrained Transformer. Developed by Min Si Thu, these models are supported by robust and well-documented code. MyanmarGPT-Big, with 1.42 billion parameters, caters to enterprise-level language processing.


MyanmarGPT, the first and largest usable Burmese Language Generative Pretrained Transformer, represents a significant milestone in Myanmar's foray into artificial intelligence. Developed by Min Si Thu, these models are not only technological achievements but are also backed by robust, well-documented code, making them accessible and user-friendly for developers.

MyanmarGPT: A Fusion of Power and Clarity

  1. Free to Use and Open-Source: MyanmarGPT and MyanmarGPT-Big are open-source models, allowing developers to freely explore, contribute, and integrate them into their projects. Both are published on Hugging Face as jojo-ai-mst/MyanmarGPT and jojo-ai-mst/MyanmarGPT-Big.
  2. Lightweight and Accurate: With 128 million parameters, MyanmarGPT is light enough to deploy on a wide range of devices without compromising accuracy. MyanmarGPT-Big, with 1.42 billion parameters, targets enterprise-level language processing, offering precision and versatility.
  3. Burmese + International Languages: The MyanmarGPT models support a total of 61 languages, prioritizing Burmese while embracing international diversity. This multilingual capability makes them a valuable resource for a wide range of developers.
  4. Community-Driven Development: The success of MyanmarGPT is fueled by community contributions. Under the guidance of Min Si Thu, these models continuously evolve, ensuring their relevance and effectiveness across various applications.
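The code examples in this article assume the Hugging Face Transformers library and a PyTorch backend are already installed. A minimal setup might look like the following (these are the standard PyPI package names, not anything MyanmarGPT-specific):

```shell
# Install Hugging Face Transformers and a PyTorch backend
pip install transformers torch
```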

Unveiling MyanmarGPT Models:

MyanmarGPT - 128M Parameters

MyanmarGPT, with its lightweight design, is suitable for a variety of applications. Below is an example of how to use it with the Hugging Face Transformers library.

from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Using the high-level pipeline API; do_sample=False gives deterministic greedy decoding
pipe_gpt = pipeline("text-generation", model="jojo-ai-mst/MyanmarGPT")
outputs_gpt = pipe_gpt("အီတလီ", do_sample=False)
print(outputs_gpt)

# Using AutoTokenizer and AutoModelForCausalLM directly for finer control
tokenizer_gpt = AutoTokenizer.from_pretrained("jojo-ai-mst/MyanmarGPT")
model_gpt = AutoModelForCausalLM.from_pretrained("jojo-ai-mst/MyanmarGPT")

input_ids_gpt = tokenizer_gpt.encode("ချစ်သား", return_tensors='pt')  # tokenize the Burmese prompt
output_gpt = model_gpt.generate(input_ids_gpt, max_length=50)  # generate up to 50 tokens
print(tokenizer_gpt.decode(output_gpt[0], skip_special_tokens=True))
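The example above uses greedy decoding (do_sample=False), which always yields the same continuation for a given prompt. For more varied Burmese text, the standard Transformers sampling parameters can be enabled. The snippet below is a sketch using ordinary generate()/pipeline arguments (temperature, top_k, top_p); the specific values are illustrative, not recommendations from the MyanmarGPT authors:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="jojo-ai-mst/MyanmarGPT")

# Stochastic sampling: each call may produce a different continuation
outputs = pipe(
    "အီတလီ",            # Burmese prompt ("Italy")
    do_sample=True,      # enable sampling instead of greedy decoding
    temperature=0.7,     # values below 1.0 sharpen the distribution toward likely tokens
    top_k=50,            # sample only from the 50 most likely tokens
    top_p=0.95,          # nucleus sampling: keep tokens covering 95% of probability mass
    max_length=50,
)
print(outputs[0]["generated_text"])
```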

MyanmarGPT-Big - 1.42B Parameters

MyanmarGPT-Big, designed for enterprise-level language modeling, currently supports 61 languages. Below is an example of how to use it with the Hugging Face Transformers library.

from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Using the high-level pipeline API; do_sample=False gives deterministic greedy decoding
pipe_big = pipeline("text-generation", model="jojo-ai-mst/MyanmarGPT-Big")
outputs_big = pipe_big("အီတလီ", do_sample=False)
print(outputs_big)

# Using AutoTokenizer and AutoModelForCausalLM directly for finer control
tokenizer_big = AutoTokenizer.from_pretrained("jojo-ai-mst/MyanmarGPT-Big")
model_big = AutoModelForCausalLM.from_pretrained("jojo-ai-mst/MyanmarGPT-Big")

input_ids_big = tokenizer_big.encode("ချစ်သား", return_tensors='pt')  # tokenize the Burmese prompt
output_big = model_big.generate(input_ids_big, max_length=50)  # generate up to 50 tokens
print(tokenizer_big.decode(output_big[0], skip_special_tokens=True))
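At 1.42 billion parameters, MyanmarGPT-Big benefits from GPU inference, and half-precision loading roughly halves its memory footprint. The sketch below uses standard PyTorch/Transformers options (torch_dtype, .to(device)) and falls back to CPU when no GPU is available; it is an assumption-laden convenience pattern, not part of the official examples:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Pick a device and a matching dtype; float16 is only reliable on GPU
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

tokenizer = AutoTokenizer.from_pretrained("jojo-ai-mst/MyanmarGPT-Big")
model = AutoModelForCausalLM.from_pretrained(
    "jojo-ai-mst/MyanmarGPT-Big",
    torch_dtype=dtype,  # half precision on GPU reduces memory use
).to(device)

# Keep the input tensor on the same device as the model
input_ids = tokenizer.encode("ချစ်သား", return_tensors="pt").to(device)
output = model.generate(input_ids, max_length=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```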

Acknowledging Contributors:

The success of MyanmarGPT is a collaborative effort, and we extend our gratitude to Min Si Thu and the vibrant community of contributors who have played a crucial role in shaping and refining these models.


Conclusion: MyanmarGPT is not just a language model; it's a tool designed for developers, supported by clear and comprehensive code documentation. As Myanmar embraces artificial intelligence, MyanmarGPT stands as a symbol of progress and inclusivity, offering the community the resources needed to push the boundaries of technology in Myanmar and beyond.

