Welcome to DGPT
Introducing Decentralized Generative Pre-Trained Transformers (DGPT), a decentralized competitor to ChatGPT. Unlike ChatGPT, which is closed source, DGPT is fully open source and decentralized.
What sets DGPT apart from the AI crowd is our powerful algorithm, created to define and model inputs to produce the correct output. There is no calling out to other LLMs (Large Language Models) through their APIs; DGPT stands in its own right with its own unique algorithm.
To support this innovation in decentralized LLMs we present $DGPT, an ERC-20 token deployed on the Ethereum network with rich tokenomics.
Token holders are empowered to submit proposals and take part in the governance of DGPT. $DGPT holders make up the community that will foster and drive the new age of large language modelling in AI.
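Because $DGPT is a standard ERC-20 token, holders can verify their balance (and therefore their stake in governance) with any Ethereum client library. Below is a minimal sketch using web3.py; the RPC URL, contract address, and holder address are placeholders, not the real $DGPT deployment:

```python
from web3 import Web3

# Connect to an Ethereum node (the RPC URL is a placeholder).
w3 = Web3(Web3.HTTPProvider("https://mainnet.example-rpc.org"))

# Minimal ERC-20 ABI: only the balanceOf function is needed here.
ERC20_ABI = [{
    "name": "balanceOf",
    "type": "function",
    "stateMutability": "view",
    "inputs": [{"name": "owner", "type": "address"}],
    "outputs": [{"name": "", "type": "uint256"}],
}]

# Placeholder address -- substitute the actual $DGPT contract.
token = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",
    abi=ERC20_ABI,
)

holder = "0x0000000000000000000000000000000000000001"
balance = token.functions.balanceOf(holder).call()
print(balance)
```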
DGPT (Decentralized Generative Pre-Trained Transformers) is developing decentralized generative pre-trained transformers over which no single company will hold an exclusive license, so that everyone has access to the underlying code for the benefit of all. DGPT-1 is an autoregressive language model that uses deep learning to produce human-like text. It is the first-generation language prediction model in the DGPT-n series, which aims to be the first NLP model with trillions of machine-learning parameters, enabled through outsourced distributed computing. DGPT-1 aims to be the largest non-sparse language model, with greater capacity and a higher parameter count than any other system, including the GPT-n series.
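In general terms, an autoregressive language model generates text one token at a time, feeding each prediction back in as context for the next. The toy sketch below illustrates that loop; the `next_token_logits` function is a stand-in for a real trained transformer, which this page does not specify:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "model", "predicts", "text", "."]

def next_token_logits(context: list[int]) -> np.ndarray:
    # Stand-in for a trained transformer: returns one logit per
    # vocabulary entry. A real model would condition on `context`.
    return rng.normal(size=len(VOCAB)) + 0.1 * len(context)

def generate(prompt: list[int], max_new_tokens: int = 5) -> list[int]:
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = next_token_logits(tokens)
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        # Sample the next token and append it to the context:
        # this feedback loop is what makes the model autoregressive.
        tokens.append(int(rng.choice(len(VOCAB), p=probs)))
    return tokens

print(" ".join(VOCAB[t] for t in generate([0])))
```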
DGPT-1 is trained on a weighted average across a finite set of datasets (raw or compressed), such as the following; a sampling sketch follows the list:
• Common Crawl byte-pair-encoded tokens
• WebText2 byte-pair-encoded tokens
• Books1/Books2 byte-pair-encoded tokens
• Wikipedia article byte-pair-encoded tokens
• Additional platforms unique to DGPT
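One common way to realise such a weighted mixture is to pick which corpus each training example is drawn from in proportion to its weight. The sketch below shows the idea; the weight values are placeholders for illustration, since the actual DGPT-1 weighting is not published here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture weights -- placeholders, not DGPT-1's real mix.
datasets = {
    "common_crawl": 0.60,
    "webtext2":     0.20,
    "books1_2":     0.12,
    "wikipedia":    0.05,
    "dgpt_extra":   0.03,
}

names = list(datasets)
weights = np.array(list(datasets.values()))
weights /= weights.sum()  # normalise so the weights form a distribution

def sample_source() -> str:
    """Pick which corpus the next training example comes from."""
    return names[rng.choice(len(names), p=weights)]

# Over many draws, each corpus appears in proportion to its weight.
print([sample_source() for _ in range(5)])
```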
DGPT-1 will be trained on trillions of words; it will be capable of coding in CSS, JSX, Python, and other programming languages, is designed to eliminate toxic language, and can perform zero-shot, one-shot, and few-shot learning.
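Zero-, one-, and few-shot learning refer to how many worked examples the model sees in its prompt before the actual task. A minimal sketch of the three prompt formats; the translation task and examples here are illustrative, not taken from the DGPT docs:

```python
task = "Translate English to French:"
examples = [("cheese", "fromage"), ("bread", "pain")]
query = "water =>"

def build_prompt(n_shots: int) -> str:
    """Prepend n_shots worked examples to the query."""
    lines = [task]
    for src, tgt in examples[:n_shots]:
        lines.append(f"{src} => {tgt}")
    lines.append(query)
    return "\n".join(lines)

print(build_prompt(0))  # zero-shot: task description only
print(build_prompt(1))  # one-shot: a single worked example
print(build_prompt(2))  # few-shot: several worked examples
```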