Metadata-Version: 2.1
Name: happytransformer
Version: 2.2.0
Summary: Happy Transformer is an API built on top of Hugging Face's Transformers library that makes it easy to utilize state-of-the-art NLP models.
Home-page: https://github.com/EricFillion/happy-transformer
Author: The Happy Transformer Development Team
Author-email: happytransformer@gmail.com
License: Apache 2.0
Description: [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) 
        [![Downloads](https://pepy.tech/badge/happytransformer)](https://pepy.tech/project/happytransformer)
        [![Website shields.io](https://img.shields.io/website-up-down-green-red/http/shields.io.svg)](http://happytransformer.com)
        
        # Happy Transformer 
        **Documentation and news: 
        [happytransformer.com](http://happytransformer.com)**
        
        ![HappyTransformer](logo.png)
        
        Happy Transformer is a package built on top of [Hugging Face's Transformers library](https://huggingface.co/transformers/) that makes it easy to utilize state-of-the-art NLP models. 
        
        ## Features 
          
        | Public Methods                     | Basic Usage  | Training   |
        |------------------------------------|--------------|------------|
        | Word Prediction                    | ✔            | ✔          |
        | Text Generation                    | ✔            | ✔          |
        | Text Classification                | ✔            | ✔          | 
        | Question Answering                 | ✔            | ✔          | 
        | Next Sentence Prediction           | ✔            |            | 
        | Token Classification               | ✔            |            | 
        
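        Each task in the table is exposed through its own Happy class with the same usage pattern. As an illustration, here is a question-answering sketch; the default model and the exact shape of the result object may vary between versions, so treat this as a non-authoritative example:
        
        ```python
        from happytransformer import HappyQuestionAnswering
        
        # Loads a default extractive-QA model (a distilled SQuAD checkpoint).
        happy_qa = HappyQuestionAnswering()
        
        # answer_question(context, question) returns a list of result objects,
        # each with an extracted answer span and a confidence score.
        result = happy_qa.answer_question(
            "Ottawa is the capital of Canada.",
            "What is the capital of Canada?"
        )
        print(result[0].answer)
        ```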
        ## Quick Start
        ```sh
        pip install happytransformer
        ```
        
        ```python
        from happytransformer import HappyWordPrediction
        
        happy_wp = HappyWordPrediction()  # defaults to distilbert-base-uncased
        result = happy_wp.predict_mask("I think therefore I [MASK]")
        print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
        print(result[0].token)  # am
        ```
        
        ## Maintainers
        - [Eric Fillion](https://github.com/ericfillion): Lead Maintainer
        - [Ted Brownlow](https://github.com/ted537): Maintainer
        
Keywords: bert,roberta,xlnet,transformer,happy,HappyTransformer,classification,nlp,nlu,natural,language,processing,understanding
Platform: UNKNOWN
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Topic :: Text Processing :: Linguistic
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Description-Content-Type: text/markdown
