Use GPT-2 for Text-generation
A GPT-2 variant that supports capping the maximum number of training iterations. (Meant to be integrated with another GPT-2 project I'm developing.)
A small application for testing functionality of OpenAI's Generative Pre-trained Transformer 2 (GPT-2) model.
This repository showcases the potential of machine learning models for generating Python code from user input. The project uses the GitHub API to collect Python repositories, preprocesses the data, trains a GPT-2 language model, and generates Python code with the trained model.
A GPT-2 model built from scratch in PyTorch (inspired by Andrej Karpathy).
This repository contains a transformer implementation in PyTorch, as taught by Andrej Karpathy.
Platform for generating lyrics and songs and for chatting with late artists. Adaptable to any artist given enough data. Currently: Ahmet Kaya.
A method for automatically generating high-quality fake text at scale.
Implementation of deep-learning-based language models from scratch in PyTorch.
Streamlit app for a Homer Simpson personality chatbot.
Text Generation demo using GPT-2 + Gradio/FastAPI
🚀 Collection of Awesome ChatGPT Tools & Resources
A simple Discord chatbot for local GPT-2 TensorFlow models
A survey on RL via sequence modeling using the Decision Transformer.
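Most of the projects above share the same core loop: tokenize a prompt, sample continuation tokens from a GPT-2 model, and decode the result. Below is a minimal sketch of that loop using the Hugging Face transformers library; the library choice, the "gpt2" checkpoint, and the sampling parameters are assumptions for illustration, since the individual repositories use their own stacks (TensorFlow, from-scratch PyTorch, etc.).

```python
# Minimal GPT-2 text-generation sketch (assumes `pip install transformers torch`).
# The "gpt2" checkpoint and sampling settings here are illustrative choices,
# not taken from any specific repository listed above.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Machine learning models can"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample up to 50 new tokens; sampling (rather than greedy decoding)
# produces more varied continuations.
output_ids = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```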