💬 Open source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants

Rasa Open Source


Rasa is an open source machine learning framework to automate text- and voice-based conversations. With Rasa, you can build contextual assistants on:

  • Facebook Messenger
  • Slack
  • Google Hangouts
  • Webex Teams
  • Microsoft Bot Framework
  • Rocket.Chat
  • Mattermost
  • Telegram
  • Twilio
  • Your own custom conversational channels

or voice assistants as:

  • Alexa Skills
  • Google Home Actions

Rasa helps you build contextual assistants capable of having layered conversations with lots of back-and-forth. In order for a human to have a meaningful exchange with a contextual assistant, the assistant needs to be able to use context to build on things that were previously discussed – Rasa enables you to build assistants that can do this in a scalable way.

There's a lot more background information in this blog post.



Where to get help

There is extensive documentation in the Rasa Docs. Make sure to select the correct version so you are looking at the docs for the version you installed.

Please use the Rasa Community Forum for quick answers to questions.


How to contribute

We are very happy to receive and merge your contributions into this repository!

To contribute via pull request, follow these steps:

  1. Create an issue describing the feature you want to work on (or have a look at the contributor board)
  2. Write your code, tests and documentation, and format them with black
  3. Create a pull request describing your changes

For more detailed instructions on how to contribute code, check out these code contributor guidelines.

You can find more information about how to contribute to Rasa (in lots of different ways!) on our website.

Your pull request will be reviewed by a maintainer, who will get back to you about any necessary changes or questions. You will also be asked to sign a Contributor License Agreement.

Development Internals

Installing Poetry

Rasa uses Poetry for packaging and dependency management. If you want to build it from source, you have to install Poetry first. This is how it can be done:

curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python

There are several other ways to install Poetry. Please follow the official guide to see all possible options.

Managing environments

The official Poetry guide suggests using pyenv or a similar tool to easily switch between Python versions. This is how it can be done:

pyenv install 3.7.6
pyenv local 3.7.6  # Activate Python 3.7.6 for the current project

By default, Poetry will try to use the currently activated Python version to create the virtual environment for the current project automatically. You can also create and activate a virtual environment manually — in this case, Poetry should pick it up and use it to install the dependencies. For example:

python -m venv .venv
source .venv/bin/activate

You can make sure that the environment is picked up by executing

poetry env info

Building from source

To install dependencies and rasa itself in editable mode execute

make install

Running and changing the documentation

First of all, install all the required dependencies:

make install install-docs

After the installation has finished, you can run and view the documentation locally using:

make livedocs

It should open a new tab with the local version of the docs in your browser; if not, visit http://localhost:3000 in your browser. You can now change the docs locally and the web page will automatically reload and apply your changes.

Running the Tests

In order to run the tests, make sure that you have the development requirements installed:

make prepare-tests-ubuntu # Only on Ubuntu and Debian based systems
make prepare-tests-macos  # Only on macOS

Then, run the tests:

make test

They can also be run across multiple jobs to save some time:

JOBS=[n] make test

Where [n] is the number of jobs desired. If omitted, [n] will be automatically chosen by pytest.
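For example, to run the test suite with four parallel jobs:

```shell
# Run the test suite across four parallel pytest jobs
JOBS=4 make test
```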

Resolving merge conflicts

By default, Poetry doesn't include any tooling to help resolve merge conflicts in the lock file poetry.lock. However, there is a great tool called poetry-merge-lock. Here is how you can install it:

pip install poetry-merge-lock

Just execute this command to resolve merge conflicts in poetry.lock automatically:

poetry-merge-lock

Build a Docker image locally

In order to build a Docker image on your local machine execute the following command:

make build-docker

The Docker image is available on your local machine as rasa:localdev.

Code Style

To ensure a standardized code style we use the formatter black. To ensure our type annotations are correct we use the type checker pytype. If your code is not formatted correctly or doesn't type check, the GitHub build will fail.

Formatting

If you want to automatically format your code on every commit, you can use pre-commit. Just install it via pip install pre-commit and execute pre-commit install in the root folder. This will add a hook to the repository, which reformats files on every commit.
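The pre-commit setup described above, as a sketch:

```shell
# Install pre-commit and register the git hook in the repository root
pip install pre-commit
pre-commit install   # adds a hook that reformats files on every commit
```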

If you want to set it up manually, install black via poetry install. To reformat files execute

make formatter

Type Checking

If you want to check types on the codebase, install mypy using poetry install. To check the types execute

make types

Deploying documentation updates

We use Docusaurus v2 to build docs for tagged versions and for the main branch. The static site that gets built is pushed to the documentation branch of this repo.

We host the site on netlify. On main branch builds (see .github/workflows/documentation.yml), we push the built docs to the documentation branch. Netlify automatically re-deploys the docs pages whenever there is a change to that branch.

Releases

Release Timeline for Minor Releases

For Rasa Open Source, we usually commit to time-based releases, specifically on a monthly basis. This means that we commit beforehand to releasing a specific version of Rasa Open Source on a specific day, and we cannot be 100% sure what will go in a release, because certain features may not be ready.

At the beginning of each quarter, the Rasa team will review the scheduled release dates for all products and make sure they work for the projected work we have planned for the quarter, as well as work well across products.

Once the dates are settled upon, we update the respective milestones.

Cutting a Major / Minor release

A week before release day

  1. Make sure the milestone already exists and is scheduled for the correct date.
  2. Take a look at the issues & PRs that are in the milestone: does it look about right for the release highlights we are planning to ship? Does it look like anything is missing? Don't worry about being aware of every PR that should be in, but it's useful to take a moment to evaluate what's assigned to the milestone.
  3. Post a message on the engineering Slack channel, letting the team know you'll be the one cutting the upcoming release, as well as:
    1. Providing the link to the appropriate milestone
    2. Reminding everyone to go over their issues and PRs and please assign them to the milestone
    3. Reminding everyone of the scheduled date for the release

A day before release day

  1. Go over the milestone and evaluate the status of any PR merging that's happening. Follow up with people on their bugs and fixes. If the release introduces new bugs or regressions that can't be fixed in time, discuss it on Slack and decide whether to go forward or postpone the release. The PR / issue owners are responsible for communicating any issues which might be release-relevant.

Release day! 🚀

  1. At the start of the day, post a short message on Slack announcing release day! Communicate that you'll be handling the release, and the time you're aiming to start releasing (again, no later than 4pm, as issues may arise and cause delays)
  2. Make sure the milestone is empty (everything has been either merged or moved to the next milestone)
  3. Once everything in the milestone is taken care of, post a small message on Slack communicating you are about to start the release process (in case anything is missing).
  4. You may now do the release by following the instructions outlined in the Rasa Open Source README!

Steps to release a new version

Releasing a new version is quite simple, as the packages are built and distributed by GitHub Actions.

Terminology:

  • micro release (third version part increases): 1.1.2 -> 1.1.3
  • minor release (second version part increases): 1.1.3 -> 1.2.0
  • major release (first version part increases): 1.2.0 -> 2.0.0

Release steps:

  1. Make sure all dependencies are up to date (especially Rasa SDK)
    • For Rasa SDK that means first creating a new Rasa SDK release (make sure the version numbers between the new Rasa and Rasa SDK releases match)
    • Once the tag with the new Rasa SDK release is pushed and the package appears on PyPI, the dependency in the rasa repository can be resolved (see below).
  2. Switch to the branch you want to cut the release from (main in case of a major / minor, the current feature branch for micro releases)
    • Update the rasa-sdk entry in pyproject.toml with the new release version and run poetry update. This creates a new poetry.lock file with all dependencies resolved.
    • Commit the changes with git commit -am "bump rasa-sdk dependency" but do not push them. They will be automatically picked up by the following step.
  3. Run make release
  4. Create a PR against main or the release branch (e.g. 1.2.x)
  5. Once your PR is merged, tag a new release (this SHOULD always happen on main or release branches), e.g. using
    git tag 1.2.0 -m "next release"
    git push origin 1.2.0
    GitHub will build this tag and publish the build artifacts.
  6. If this is a minor release, a new release branch should be created pointing to the same commit as the tag to allow for future patch releases, e.g.
    git checkout -b 1.2.x
    git push origin 1.2.x
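Assuming a hypothetical 1.2.0 minor release, steps 2-6 above can be sketched as:

```shell
# Sketch of the minor-release flow described above (version numbers are illustrative)
git checkout main
# edit the rasa-sdk entry in pyproject.toml, then re-resolve the lock file:
poetry update
git commit -am "bump rasa-sdk dependency"   # do not push; make release picks this up
make release
# ...open a PR against main and get it merged, then:
git tag 1.2.0 -m "next release"
git push origin 1.2.0        # GitHub Actions builds and publishes this tag
git checkout -b 1.2.x        # release branch for future micro releases
git push origin 1.2.x
```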

Cutting a Micro release

Micro releases are simpler to cut, since they are meant to contain only bugfixes.

The only things you need to do to cut a micro are:

  1. Notify the engineering team on Slack that you are planning to cut a micro, in case someone has an important fix to add.
  2. Make sure the bugfix(es) are in the release branch you will use (e.g. if you are cutting a 2.0.4 micro, you will need your fixes to be on the 2.0.x release branch). All micros must come from a .x branch!
  3. Once you're ready to release the Rasa Open Source micro, checkout the branch, run make release and follow the steps + get the PR merged.
  4. Once the PR is in, pull the .x branch again and push the tag!

License

Licensed under the Apache License, Version 2.0. Copyright 2020 Rasa Technologies GmbH. Copy of the license.

A list of the Licenses of the dependencies of the project can be found at the bottom of the Libraries Summary.

Comments
  • Simple Rasa Core + NLU Rest API delivering responses

    My goal is to wrap the rasa core + nlu bot into an API so that I can call it in some JavaScript-based UI, such as a simple angularjs-chat (http://angularjs.chat/tutorials/angularjs-basic-chat-module/).

    However, I can only use CustomInput to send a message to "localhost:5005" by POST but I cannot get the response from my bot. Is there any way that I could build/modify on top of the rasa_core? I know that it is already using Flask, so I don’t want to build another Flask app on top of that.

    In summary, I guess it would be great if we can set up an API such as "localhost:5005" to receive the questions from users and send responses back to the same channel, like what we have using the consoleInput.
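    A minimal sketch of what the commenter is asking for, using Rasa's built-in REST channel (available in later Rasa versions), assuming `rest:` is enabled in `credentials.yml` and the server is running via `rasa run` on port 5005:

```shell
# Send a user message to the REST input channel; bot replies come back as JSON
curl -X POST http://localhost:5005/webhooks/rest/webhook \
  -H "Content-Type: application/json" \
  -d '{"sender": "user1", "message": "hello"}'
```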

  • Getting RASA to Work on the Raspberry Pi 4 (Buster)

    Rasa version: 1.3.8

    Rasa X version: N/A

    Python version: 3.5

    Operating system (windows, osx, ...): Raspbian Buster

    Issue:

    I am in the process of trying to install Rasa on my Raspberry Pi 4 (4GB RAM edition) and I continue running into an error with installing or-tools stating:

      ERROR: Could not find a version that satisfies the requirement ortools (from mesh-tensorflow->tensor2tensor~=1.14.0->rasa) (from versions: none)
    ERROR: No matching distribution found for ortools (from mesh-tensorflow->tensor2tensor~=1.14.0->rasa)
    

    Any suggestions on any potential work-arounds?

    Here is what I tried so far:

    • I tried to build or-tools on an x86_64 based computer running Linux. Once I had it successfully built, I transferred it to the Pi and even though I could import ortools, I could not perform any operations using it.
    • Attempted to build or-tools on the Pi via these instructions but, given that the instructions are two years old and or-tools has changed dramatically, the process fails with the RPi's CPU spiking to 300% before shutting off. I also tried this process on my linux computer and it almost overloaded my 8GB of RAM - so it is a fairly intensive task!

    Another thought was to build a wheel for or-tools compatible with the armv7l architecture, but I do not know how feasible this is - it could be bundled with future releases of rasa.



    Replication of Error Procedure

    Here are the steps I took to get to this point; follow them completely to replicate my error:

    1. Freshly install a new version of Raspbian Lite (Buster) on an SD Card

    2. Run the following to update the Raspberry Pi:

      sudo apt-get update && 
      sudo apt-get upgrade -y &&
      sudo apt-get dist-upgrade -y
      
    3. Install Berryconda package manager:

      wget https://github.com/jjhelmus/berryconda/releases/download/v2.0.0/Berryconda3-2.0.0-Linux-armv7l.sh &&
      chmod +x Berryconda3-2.0.0-Linux-armv7l.sh &&
      ./Berryconda3-2.0.0-Linux-armv7l.sh && sudo reboot
      
    4. Add RPI channel into conda:

      conda config --add channels rpi

    5. Install Python3.5 into conda:

      conda install python=3.5 -y

    6. Set up the conda environment (we use Python 3.5 so as to not encounter an error with open-cv):

      A. Create conda environment: conda create --name alpha python=3.5 pip -y

      B. Update all packages in conda environment:

      conda update --all -y &&
      source activate alpha &&
      pip install --upgrade pip setuptools wheel
      
    7. Install rasa:

      A. Install required system packages:

      sudo apt-get install libatlas-base-dev python3-dev python3-pip libhdf5-dev -y

      B. Reboot the computer

      sudo reboot

      C. Install rasa

      source activate alpha && pip install rasa

    Error (including full traceback):

    (pip log truncated: pip collects the rasa 1.3.9 wheel and its dependencies from PyPI and piwheels, and the install ultimately fails with the ortools error quoted above.)
      WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))': /simple/pillow/
      Downloading https://www.piwheels.org/simple/pillow/Pillow-6.2.0-cp35-cp35m-linux_armv7l.whl (1.1MB)
    Collecting h5py (from tensor2tensor~=1.14.0->rasa)
      Downloading https://www.piwheels.org/simple/h5py/h5py-2.10.0-cp35-cp35m-linux_armv7l.whl (3.5MB)
    Collecting docopt>=0.6.2 (from pykwalify~=1.7.0->rasa)
      Downloading https://www.piwheels.org/simple/docopt/docopt-0.6.2-py2.py3-none-any.whl
    Collecting PyYAML>=3.11 (from pykwalify~=1.7.0->rasa)
      Downloading https://www.piwheels.org/simple/pyyaml/PyYAML-5.1.2-cp35-cp35m-linux_armv7l.whl (45kB)
    Collecting humanfriendly>=4.7 (from coloredlogs~=10.0->rasa)
      Downloading https://files.pythonhosted.org/packages/90/df/88bff450f333114680698dc4aac7506ff7cab164b794461906de31998665/humanfriendly-4.18-py2.py3-none-any.whl (73kB)
    Collecting tzlocal>=1.2 (from apscheduler~=3.0->rasa)
      Downloading https://files.pythonhosted.org/packages/ef/99/53bd1ac9349262f59c1c421d8fcc2559ae8a5eeffed9202684756b648d33/tzlocal-2.0.0-py2.py3-none-any.whl
    Collecting botocore<1.13.0,>=1.12.248 (from boto3~=1.9->rasa)
      Downloading https://files.pythonhosted.org/packages/50/5c/922cc8c2cdcb905e96bd73dc2064939bc9157e9e1d4c44e54e775cd2cddc/botocore-1.12.248-py2.py3-none-any.whl (5.7MB)
    Collecting s3transfer<0.3.0,>=0.2.0 (from boto3~=1.9->rasa)
      Downloading https://files.pythonhosted.org/packages/16/8a/1fc3dba0c4923c2a76e1ff0d52b305c44606da63f718d14d3231e21c51b0/s3transfer-0.2.1-py2.py3-none-any.whl (70kB)
    Collecting jmespath<1.0.0,>=0.7.1 (from boto3~=1.9->rasa)
      Downloading https://files.pythonhosted.org/packages/83/94/7179c3832a6d45b266ddb2aac329e101367fbdb11f425f13771d27f225bb/jmespath-0.9.4-py2.py3-none-any.whl
    Collecting pysocks; python_version >= "3.0" (from twilio~=6.0->rasa)
      Downloading https://files.pythonhosted.org/packages/8d/59/b4572118e098ac8e46e399a1dd0f2d85403ce8bbaad9ec79373ed6badaf9/PySocks-1.7.1-py3-none-any.whl
    Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.20->rasa)
      Downloading https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl (125kB)
    Collecting idna<2.9,>=2.5 (from requests>=2.20->rasa)
      Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
    Collecting flask-cors~=3.0 (from rasa-sdk~=1.3.0->rasa)
      Downloading https://files.pythonhosted.org/packages/78/38/e68b11daa5d613e3a91e4bf3da76c94ac9ee0d9cd515af9c1ab80d36f709/Flask_Cors-3.0.8-py2.py3-none-any.whl
    Collecting ConfigArgParse~=0.14 (from rasa-sdk~=1.3.0->rasa)
      Downloading https://www.piwheels.org/simple/configargparse/ConfigArgParse-0.15.1-py3-none-any.whl
    Collecting prompt-toolkit~=2.0 (from questionary>=1.1.0->rasa)
      Downloading https://files.pythonhosted.org/packages/87/61/2dfea88583d5454e3a64f9308a686071d58d59a55db638268a6413e1eb6d/prompt_toolkit-2.0.10-py3-none-any.whl (340kB)
    Collecting websocket-client<0.55.0,>=0.35 (from slackclient~=1.3->rasa)
      Downloading https://files.pythonhosted.org/packages/26/2d/f749a5c82f6192d77ed061a38e02001afcba55fe8477336d26a950ab17ce/websocket_client-0.54.0-py2.py3-none-any.whl (200kB)
    Collecting uvloop>=0.5.3; sys_platform != "win32" and implementation_name == "cpython" (from sanic~=19.3.1->rasa)
      Downloading https://www.piwheels.org/simple/uvloop/uvloop-0.13.0-cp35-cp35m-linux_armv7l.whl (3.2MB)
    Collecting httptools>=0.0.10 (from sanic~=19.3.1->rasa)
      Downloading https://www.piwheels.org/simple/httptools/httptools-0.0.13-cp35-cp35m-linux_armv7l.whl (207kB)
    Collecting aiofiles>=0.3.0 (from sanic~=19.3.1->rasa)
      Downloading https://files.pythonhosted.org/packages/cf/f2/a67a23bc0bb61d88f82aa7fb84a2fb5f278becfbdc038c5cbb36c31feaf1/aiofiles-0.4.0-py3-none-any.whl
    Collecting ujson>=1.35; sys_platform != "win32" and implementation_name == "cpython" (from sanic~=19.3.1->rasa)
      Downloading https://www.piwheels.org/simple/ujson/ujson-1.35-cp35-cp35m-linux_armv7l.whl (72kB)
    Collecting websockets<7.0,>=6.0 (from sanic~=19.3.1->rasa)
      Downloading https://www.piwheels.org/simple/websockets/websockets-6.0-cp35-cp35m-linux_armv7l.whl (92kB)
    Collecting cycler>=0.10 (from matplotlib~=3.0->rasa)
      Downloading https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
    Collecting kiwisolver>=1.0.1 (from matplotlib~=3.0->rasa)
      Downloading https://www.piwheels.org/simple/kiwisolver/kiwisolver-1.1.0-cp35-cp35m-linux_armv7l.whl (912kB)
    Collecting python-crfsuite>=0.8.3 (from sklearn-crfsuite~=0.3.6->rasa)
      Downloading https://www.piwheels.org/simple/python-crfsuite/python_crfsuite-0.9.6-cp35-cp35m-linux_armv7l.whl (659kB)
    Collecting tabulate (from sklearn-crfsuite~=0.3.6->rasa)
      Downloading https://www.piwheels.org/simple/tabulate/tabulate-0.8.5-py3-none-any.whl
    Collecting greenlet>=0.4.14; platform_python_implementation == "CPython" (from gevent~=1.4->rasa)
      Downloading https://www.piwheels.org/simple/greenlet/greenlet-0.4.15-cp35-cp35m-linux_armv7l.whl (42kB)
    Collecting requests-toolbelt (from webexteamssdk~=1.1->rasa)
      Downloading https://files.pythonhosted.org/packages/60/ef/7681134338fc097acef8d9b2f8abe0458e4d87559c689a8c306d0957ece5/requests_toolbelt-0.9.1-py2.py3-none-any.whl (54kB)
    Collecting markdown>=2.6.8 (from tensorboard<1.15.0,>=1.14.0->tensorflow~=1.14.0->rasa)
      Downloading https://files.pythonhosted.org/packages/c0/4e/fd492e91abdc2d2fcb70ef453064d980688762079397f779758e055f6575/Markdown-3.1.1-py2.py3-none-any.whl (87kB)
    Collecting werkzeug>=0.11.15 (from tensorboard<1.15.0,>=1.14.0->tensorflow~=1.14.0->rasa)
      Downloading https://files.pythonhosted.org/packages/ce/42/3aeda98f96e85fd26180534d36570e4d18108d62ae36f87694b476b83d6f/Werkzeug-0.16.0-py2.py3-none-any.whl (327kB)
    Collecting asn1crypto>=0.21.0 (from cryptography->python-telegram-bot~=11.0->rasa)
      Downloading https://files.pythonhosted.org/packages/6e/1e/fb0e487b5229e5fb7b15c6d00b4e8082a3414fe62b1da4c9a905b106e672/asn1crypto-1.1.0-py2.py3-none-any.whl (103kB)
    Collecting cffi!=1.11.3,>=1.8 (from cryptography->python-telegram-bot~=11.0->rasa)
      Downloading https://www.piwheels.org/simple/cffi/cffi-1.12.3-cp35-cp35m-linux_armv7l.whl (310kB)
    Collecting click>=5.1 (from flask->tensor2tensor~=1.14.0->rasa)
      Downloading https://files.pythonhosted.org/packages/fa/37/45185cb5abbc30d7257104c434fe0b07e5a195a6847506c074527aa599ec/Click-7.0-py2.py3-none-any.whl (81kB)
    Collecting Jinja2>=2.10.1 (from flask->tensor2tensor~=1.14.0->rasa)
      Downloading https://files.pythonhosted.org/packages/65/e0/eb35e762802015cab1ccee04e8a277b03f1d8e53da3ec3106882ec42558b/Jinja2-2.10.3-py2.py3-none-any.whl (125kB)
    Collecting itsdangerous>=0.24 (from flask->tensor2tensor~=1.14.0->rasa)
      Downloading https://files.pythonhosted.org/packages/76/ae/44b03b253d6fade317f32c24d100b3b35c2239807046a4c953c7b89fa49e/itsdangerous-1.1.0-py2.py3-none-any.whl
    Collecting ortools (from mesh-tensorflow->tensor2tensor~=1.14.0->rasa)
      ERROR: Could not find a version that satisfies the requirement ortools (from mesh-tensorflow->tensor2tensor~=1.14.0->rasa) (from versions: none)
    ERROR: No matching distribution found for ortools (from mesh-tensorflow->tensor2tensor~=1.14.0->rasa)
    

    Command which led to error:

    pip install rasa


    If you have any questions, feel free to reach out - I hope we can get this working!
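    Reading the log above, the piwheels.org URLs and armv7l wheel tags suggest (this is an inference, not stated in the report) that the install is running on a Raspberry Pi with Python 3.5, a platform for which ortools publishes no wheels, hence "versions: none". The interpreter's own platform tag makes such mismatches visible:

```python
import platform
import sys

# Sketch: inspect the platform pip is resolving wheels for. A package
# without wheels for this machine/Python combination fails exactly as
# ortools does in the log above.
print(platform.machine())        # 'armv7l' on the Pi in the log, 'x86_64' on a PC
print(sys.version_info[:2])      # the log shows Python 3.5 (berryconda3)
```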

  • got concurrent.futures._base.CancelledError

    got concurrent.futures._base.CancelledError

    Rasa version: 1.1.6-full (docker image)

    Rasa X version (if used & relevant):

    Python version:

    Operating system (windows, osx, ...): docker

    Issue: requests seem to time out between Rasa and the action server. My action takes around 10 seconds. I don't know why it times out, since DEFAULT_REQUEST_TIMEOUT is around 5 minutes.

    The action itself executes fine on the action server.

    Error (including full traceback):

    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/site-packages/rasa-1.1.6-py3.6.egg/rasa/core/processor.py", line 439, in _run_action
        events = await action.run(output_channel, nlg, tracker, self.domain)
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/rasa-1.1.6-py3.6.egg/rasa/core/actions/action.py", line 399, in run
        json=json_body, method="post", timeout=DEFAULT_REQUEST_TIMEOUT
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/rasa-1.1.6-py3.6.egg/rasa/utils/endpoints.py", line 144, in request
        **kwargs
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/aiohttp/client.py", line 1005, in __aenter__
        self._resp = await self._coro
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/aiohttp/client.py", line 497, in _request
        await resp.start(conn)
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/aiohttp/client_reqrep.py", line 844, in start
        message, payload = await self._protocol.read()  # type: ignore  # noqa
      File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 129, in throw
        return self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python3.6/site-packages/aiohttp/streams.py", line 588, in read
        await self._waiter
    concurrent.futures._base.CancelledError

    Command or request that led to error:

    Content of configuration file (config.yml) (if relevant):

    Content of domain file (domain.yml) (if relevant):
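    The CancelledError above is the generic symptom of an awaited HTTP call being cancelled before the action server replies. As a minimal, self-contained sketch of the mechanism (plain asyncio, no Rasa code, and an assumed scenario rather than this user's setup): when an outer timeout fires, the coroutine that is still awaiting is cancelled, and CancelledError surfaces inside it.

```python
import asyncio

# Minimal sketch (assumed scenario, no Rasa involved): an "action call"
# that takes longer than the caller is willing to wait. asyncio.wait_for
# cancels the inner coroutine when its timeout expires, which surfaces as
# CancelledError inside the awaited code -- the same symptom as above.
async def slow_action_call():
    await asyncio.sleep(10)  # stands in for the ~10 s custom action
    return "action result"

async def caller(timeout):
    try:
        return await asyncio.wait_for(slow_action_call(), timeout=timeout)
    except asyncio.TimeoutError:
        return "cancelled before the action server replied"

result = asyncio.run(caller(0.05))
print(result)  # -> cancelled before the action server replied
```

    If some layer between the two servers (a proxy, a shorter per-connection timeout) cancels the request, the configured DEFAULT_REQUEST_TIMEOUT never gets a chance to apply.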

  • "Tried to set non existent slot" error persists after deleting all instances of slots and retraining

    Rasa version: rasa-core==0.13.7 rasa-core-sdk==0.12.2 rasa-nlu==0.14.6

    Python version: Python 3.6.7

    Operating system (windows, osx, ...): Distributor ID: Ubuntu Description: Ubuntu 18.04.2 LTS Release: 18.04 Codename: bionic

    Issue: The slots used in previous experiments are not resetting, even after removing them from all files (nlu.md, stories.md, domain.yml and actions.py). I tried deleting the entire model directory and training the models again, but it still shows errors for the previously used slots. Please find the error message below.

    2019-04-22 09:56:44 ERROR    rasa_core.trackers  - Tried to set non existent slot 'environment'. Make sure you added all your slots to your domain file.
    2019-04-22 09:56:44 ERROR    rasa_core.trackers  - Tried to set non existent slot 'platform'. Make sure you added all your slots to your domain file.
    

    I ran a grep command from my project directory to make sure these slots are not used anymore:

    ~/project/slackbot$ grep "environment" * -R
    
    Notebook/Conversational_Chatbot.ipynb:        "# In your environment run:\n",
    (sample_bot_env) [email protected]:~/project/slackbot$ grep "platform" * -R
    Notebook/Conversational_Chatbot.ipynb:            "Requirement already satisfied, skipping upgrade: greenlet>=0.4.14; platform_python_implementation == \"CPython\" in /usr/local/lib/python3.6/dist-packages (from gevent->rasa_nlu[spacy]) (0.4.15)\n",
    credentials.yml:# This file contains the credentials for the voice & chat platforms
    nohup.out:2019-04-03 09:58:46.340534: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
    

    Kindly note that I had used these slots before (not in the current stories) for doing some experiments.

    Thank you,
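    One possible explanation (an assumption, not confirmed in this thread) is that old conversation events persist in the tracker store, which is independent of the trained model and the project files, so grep and retraining cannot remove them. A toy sketch of the replay mechanism, using plain dictionaries rather than Rasa's actual classes:

```python
# Illustrative sketch (not Rasa's real classes): slot values live in the
# tracker store as recorded SlotSet events. Replaying an old conversation
# against a domain that no longer declares those slots reproduces the
# "Tried to set non existent slot" error.
stored_events = [  # what a tracker store might still hold from old sessions
    {"event": "slot", "name": "environment", "value": "prod"},
    {"event": "slot", "name": "platform", "value": "slack"},
]
domain_slots = {"location"}  # 'environment' and 'platform' were removed

warnings = []
slots = {}
for ev in stored_events:
    if ev["event"] == "slot":
        if ev["name"] in domain_slots:
            slots[ev["name"]] = ev["value"]
        else:
            warnings.append(f"Tried to set non existent slot '{ev['name']}'")

print(warnings)
```

    Under this reading, clearing the tracker store (or starting fresh conversations) would stop the errors, since the files themselves no longer mention the slots.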

  • MemoryError with tensorflow_embedding on ~73k dataset with 38 intents

    MemoryError with tensorflow_embedding on ~73k dataset with 38 intents

    As mentioned in the title, I am feeding in ~73k lines of training data classified into 38 intents. And I would end up using ~200k lines of messages to create my final model. But even for 73k, I get a MemoryError. This doesn't seem to be a RAM issue as I don't see my RAM getting fully used up while running the training code. Any inputs would be valuable. Below are the details:

    Rasa NLU version: 0.13.8 Operating system: Windows Server 2016

    Training the model as:

    python -m rasa_nlu.train -c nlu_config.yml --data rasa_classification_train_set.md -o models --fixed_model_name nlu_classify_75k_38ctgy --project current --verbose
    

    Content of model configuration file:

    language: "en"
    
    pipeline: "tensorflow_embedding"
    

    Output / Issue:

    2019-01-14 08:40:41 INFO     rasa_nlu.training_data.loading  - Training data format of rasa_classification_train_set.md is md
    2019-01-14 08:40:43 INFO     rasa_nlu.training_data.training_data  - Training data stats:
            - intent examples: 73962 (38 distinct intents)
    ** removing entity names **
            - entity examples: 0 (0 distinct entities)
            - found entities:
    
    2019-01-14 08:40:46 INFO     rasa_nlu.model  - Starting to train component tokenizer_whitespace
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Finished training component.
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Starting to train component ner_crf
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Finished training component.
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Starting to train component ner_synonyms
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Finished training component.
    2019-01-14 08:40:55 INFO     rasa_nlu.model  - Starting to train component intent_featurizer_count_vectors
    Traceback (most recent call last):
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\runpy.py", line 184, in _run_module_as_main
        "__main__", mod_spec)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\rasa_nlu\train.py", line 184, in <module>
        num_threads=cmdline_args.num_threads)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\rasa_nlu\train.py", line 154, in do_train
        interpreter = trainer.train(training_data, **kwargs)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\rasa_nlu\model.py", line 196, in train
        **context)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\rasa_nlu\featurizers\count_vectors_featurizer.py", line 214, in train
        X = self.vect.fit_transform(lem_exs).toarray()
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\scipy\sparse\compressed.py", line 947, in toarray
        out = self._process_toarray_args(order, out)
      File "C:\Users\harsh.khadloya\AppData\Local\Continuum\Anaconda3\lib\site-packages\scipy\sparse\base.py", line 1184, in _process_toarray_args
        return np.zeros(self.shape, dtype=self.dtype, order=order)
    MemoryError
    

    During this runtime, I don't see my RAM getting used up more than 6GB, even though I have 16GB of RAM. Thanks for your help!
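    The traceback points at the count-vectors featurizer densifying its sparse matrix (`.toarray()` ends in `np.zeros(self.shape, ...)`). A back-of-the-envelope sketch shows why a single allocation can fail long before total RAM looks exhausted; the vocabulary size below is an assumed figure for illustration, not taken from this report.

```python
# Rough sketch: the dense array requested by .toarray() is
# n_examples x vocabulary_size cells of 8 bytes each, allocated in
# one contiguous block. vocab_size is a hypothetical value.
n_examples = 73_962
vocab_size = 20_000          # assumed vocabulary for a ~73k-line corpus
bytes_per_cell = 8           # np.int64 / np.float64

dense_bytes = n_examples * vocab_size * bytes_per_cell
print(f"{dense_bytes / 1024**3:.1f} GiB")  # about 11 GiB requested at once
```

    A single ~11 GiB request fails immediately on a 16 GB machine (and a 32-bit process fails far earlier), so RAM usage never visibly climbs before the MemoryError.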

  • Upgrade to TF 2.6

    Upgrade to TF 2.6

    TF 2.4 is out, so we should update the dependency inside Rasa Open Source to use it. This introduces some breaking changes, as mentioned in the changelog.

    Update: we are now targeting 2.6; see here.

    Note that there's a draft PR here that already contains some of the necessary changes. The known remaining tasks to actually update the code for TF 2.5 are listed below (there may be others, so this is not an exhaustive list):

    • [x] Run model regression tests on all configs and datasets to see if training times / accuracy are the same
    • [x] see TODOs marked with 2.5 (ex: here)
    • [x] #9433
    • [x] #9512

    Temporarily de-prioritised followups:

    • [ ] #9514
    • [ ] #9515
    • [ ] #9516
    • [ ] #9764
    • [ ] https://github.com/RasaHQ/rasa/issues/9666
    • [ ] https://github.com/RasaHQ/rasa/issues/9798
    • [ ] https://github.com/RasaHQ/rasa/issues/9734

    Definition of done:

    • ~No memory tests failing~
    • ~No increased timeouts for tests~
    • ~Model regression tests passing with approx. same time and accuracy~

    • [X] Followup issue created: determine with DevRel how to communicate hits to community https://github.com/RasaHQ/research/issues/220
    • [X] Followup issue created: determine with CSE how to communicate hits to customers https://github.com/RasaHQ/research/issues/219
    • [X] Followup issue created: update CUDA drivers on GPU runners to be compatible with TF 2.5 : https://github.com/RasaHQ/rasa/issues/9611
    • [X] Model regression tests on GPU have approx. same accuracy
    • [X] Lockfile reflects correct TF dependency
    • [X] Create a PR for the bumped up dependency
    • [X] Create followup issue to investigate and address memory problems on Windows CI tests, so that this fix can be reverted https://github.com/RasaHQ/rasa/issues/9734
    • [X] Create followup issue to investigate and address memory problems for Hermit CI model regression tests so that they can be run on schedule again https://github.com/RasaHQ/rasa/issues/9798
  • Wrong classification for words the model has never seen?

    Wrong classification for words the model has never seen?

    I'm having difficulty understanding why my model behaves in a certain way:

    My models are trained with a big but quite homogeneous dataset such as "give me some advice" (intent: advice) or "tell me a joke" (intent: joke). The trained model works very well for similar queries. However, when seeing new phrases and/or words such as "apple" or "banana" that are obviously neither advice nor joke, they still get classified as either the advice or joke intent with very high confidence (>85%).

    Q: Any idea why this is happening? And do you have any advice on how I can do this better?

    What I have tried:

    • I have trained a fallback intent to include some of the wrong ways to get the advice or joke intent. I understand that I could also add "apple" or "banana" to the list of things that should be classified as the fallback intent. However, including all possible English words doesn't sound like a smart strategy.
    • Also possibly relevant: my dataset consists of some 20,000 examples for each of the advice and joke intents, while the dataset for the fallback intent is only around 20 examples. This imbalance in the data might contribute to the issue.
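    Part of the behaviour follows from how softmax-style classifiers normalize scores: all probability mass must go to the known intents, so even an out-of-vocabulary input makes one of them look "confident". A tiny sketch with hypothetical raw scores (not the actual model):

```python
import math

# Sketch: softmax over class scores always sums to 1, so an input the
# model has never seen ("banana") still distributes full probability
# mass across the known intents -- one of them will look confident.
def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# hypothetical raw scores for intents [advice, joke] on an OOV word
probs = softmax([2.1, 0.3])
print(max(probs))  # > 0.85 despite the input being meaningless
```

    This is why confidence thresholds (falling back when the top score is below some cutoff) are usually a better lever than trying to enumerate out-of-scope words as training examples.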
  • I want to add google's bert as a new feature

    I want to add google's bert as a new feature

    I have tested Google's BERT as a text classifier, and it performs very well at my company. So I want to bring BERT's power to Rasa NLU. I have been working on it recently; is anyone else interested?

  • Exploration of new docs build system

    Exploration of new docs build system

    Proposed changes:

    • Exploration of new docs build system

    • related to https://github.com/RasaHQ/growth/issues/1436

    • not all pages have been added to the new system yet. All of them are converted; if you are missing a page you'd like to take a look at, you need to add it to newdocs/sidebars.js

    To view the page, navigate into newdocs, run yarn to install dependencies and then yarn start. A browser page with the docs should pop up.

    The converter and necessary changes to fix some of the markdown issues can be found here https://github.com/RasaHQ/sphinx-markdown-builder/pull/1 - should be extended and adapted to fix more issues.

    How to build the MDX docs based on RST

    All of the new docs pages have been created automatically. If there are any changes in the RST docs (e.g. by new features or changes merged into master), the MDX docs can be regenerated:

    • checkout and install https://github.com/RasaHQ/sphinx-markdown-builder/pull/1 using git clone git://github.com/RasaHQ/sphinx-markdown-builder.git, git checkout convert-docs and pip install -e . (this should be in your rasa poetry env, e.g. try running poetry shell before the pip install)
    • checkout this branch and cd docs and run make markdown

    This will tell sphinx to build the docs and output markdown (instead of the usual HTML). The converter is pretty flexible and if we notice any issues, we can further customize the sphinx-markdown-builder based on the above PR.

    While converting, sphinx will create somewhat of an AST and then walk through its nodes. For each node in the doc there is a hook that decides how to convert that node into markdown.

    Remaining work before this can be merged is tracked in https://github.com/RasaHQ/rasa/issues/6165

    TODO

    • [x] move newdocs to docs. rename existing occurrences, e.g. https://github.com/RasaHQ/rasa/pull/6043/files#diff-793229551c14ebd7f4c0f9236d054ee0R40 077755869f71a9771855193a0c1ebb3b88908c54 and 4b180773a7eca14f344c44727b5793369bb8c0c2
    • [x] remove FIXMEs in scripts/push_docs_to_branch.sh script 4b180773a7eca14f344c44727b5793369bb8c0c2
    • [x] fix /docs/ links everywhere f758baf68b9
    • [x] support top-level CHANGELOG in the docs (similar setup than variables.json) 968e4a1b418
    • [x] fix make docs (failing because of redoc)
    • [x] setup Netlify redirection (see https://github.com/RasaHQ/rasa-website/pull/763)
    • [x] links appear to be broken after last alpha update e298750892f
    • [x] prepare prototyper move https://github.com/RasaHQ/rasa/pull/6298

    Fixes https://github.com/RasaHQ/rasa/issues/6170

  • Rasa command line interface

    Rasa command line interface

    Goal

    • intuitive
    • short commands (if default configuration)
    • reuse existing knowledge
    • unify interface for Rasa Core / NLU

    General Idea

    Assume a rasa project with a default project layout. This means we can assume the location and name of files (e.g. domain.yml is always in the project root and has this name). Because the CLI looks in default locations, users can skip declaring these files. If users deviate from the normal project layout, they have to specify the locations manually.

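    The default-location rule described above can be sketched as a simple fallback; the names here are illustrative, not the actual implementation.

```python
from pathlib import Path

# Illustrative sketch of the lookup rule: an explicitly given path wins;
# otherwise the file is assumed to sit at its standard spot in the
# project layout. DEFAULT_DOMAIN is a hypothetical constant.
DEFAULT_DOMAIN = Path("domain.yml")

def resolve_domain(user_path=None):
    return Path(user_path) if user_path else DEFAULT_DOMAIN

print(resolve_domain())                  # domain.yml, from the project root
print(resolve_domain("custom/dm.yml"))   # explicit override
```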
    Assumed project layout

    .
    ├── __init__.py
    ├── actions.py
    ├── endpoints.yml
    ├── credentials.yml
    ├── data
    ├── domain.yml
    ├── models
    │   ├── core
    │   └── nlu
    ├── config.yml (contains both core and nlu config)
    

    Suggested command line commands:

    Creating the initial project (new feature)

    • rasa init + interactive guide to create project (e.g. provide project name, etc)

    Training

    • Training of Core and NLU: rasa train
    • Training of the core model: rasa train core
    • Training of the nlu model: rasa train nlu
    • Interactive Training: rasa interactive (or rasa train interactive?)
    • Special case to compare policies: rasa train core --config policy1.yml policy2.yml

    Validating

    • test both: rasa test
    • test only core: rasa test core
    • test only nlu: rasa test nlu

    Run Server

    • Start Core with trained NLU model: rasa run
    • Run Core server: rasa run core
    • Run NLU server: rasa run nlu
    • Run action server: rasa run sdk (should we include the sdk or keep that separate?)

    Visualize

    • Show story paths: rasa show (alternatives: rasa inspect, rasa visualize)

    Data

    There is currently just one command in this section, but we could add other commands like rasa data validate <file> or rasa data split here in the future.

    • Convert nlu data: rasa data nlu convert

    Interacting with the server

    rasa remote

    Crucial

    • provide credentials: rasa remote login <host> (where / how do we store them)
    • select environment (prod / dev / test): rasa remote stages <env>
    • set model as active: rasa remote models select <model-id>
    • pushing models: rasa remote models push <model>
    • downloading models: rasa remote models pull <model>
    • push data: rasa remote data push
    • pull data: rasa remote data pull

    Nice to have

    • list models: rasa remote models list
    • list users: rasa remote users list
    • status : rasa remote status
    • logs: rasa remote logs
    • show config: rasa remote config

    Interacting with local server

    rasa local commands as above for server except

    • no login
    • pushing / downloading are not needed (set as active should be enough)

    Other

    • rasa --version
    • rasa --help

    Parameter Names

    • tbd, general idea is to keep them like they are, but unify them between Core and NLU

    Current Status

    • [x] rasa init
    • [x] rasa run
    • [x] rasa train
    • [x] rasa test
    • [x] rasa show
    • [ ] rasa remote
    • [ ] rasa local
    • [x] rasa data

    How to test

    https://github.com/RasaHQ/rasa_stack/issues/1#issuecomment-462871671

  • Slack triggers multiple messages to rasa (does not happen with Telegram/Commandline)

    Slack triggers multiple messages to rasa (does not happen with Telegram/Commandline)

    Rasa Core version: 0.9.0a5

    Python version: 3.6

    Operating system (windows, osx, ...): ubuntu 18.04

    Issue: Slack triggers multiple messages to rasa and NO, this does not happen with Telegram and Commandline Interface (!) with the very same trained RASA_NLU and RASA_CORE models (!!) Please, Rasa-Team: Try to reproduce that with any kind of story!

    Content of domain file (if used & relevant):

    
    
  • Correct & clarify parts of API spec

    Correct & clarify parts of API spec

    Proposed changes:

    GET /status endpoint: Correct response schema for model_id - a string, not an object.

    GET /conversations/{conversation_id}/tracker : Describe each of the enum options for include_events query parameter

    POST & PUT /conversations/{conversation_id}/tracker/events : Events schema added for each event type

    GET /conversations/{conversation_id}/story: Clarified the all_sessions query parameter and default behaviour. Also fixed the story returned when a session is ended by a restarted event, so that the story is not blank.

    POST /model/test/intents : Remove JSON option, since the server was interpreting all input as YAML regardless of content-type header.

    POST /model/parse: Explain what emulation_mode is and how it affects response results

    Status (please check what you already did):

    • [ ] added some tests for the functionality
    • [x] updated the documentation
    • [ ] updated the changelog (please check changelog for instructions)
    • [ ] reformat files using black (please check Readme for instructions)
  • Validate incoming JWT tokens from the bot framework

    Validate incoming JWT tokens from the bot framework

    Proposed changes:

    • ...

    Status (please check what you already did):

    • [ ] added some tests for the functionality
    • [ ] updated the documentation
    • [ ] updated the changelog (please check changelog for instructions)
    • [ ] reformat files using black (please check Readme for instructions)
  • prepared release of version 2.8.28

    prepared release of version 2.8.28

    Proposed changes:

    • #11122

    Status (please check what you already did):

    • [ ] added some tests for the functionality
    • [ ] updated the documentation
    • [ ] updated the changelog (please check changelog for instructions)
    • [ ] reformat files using black (please check Readme for instructions)
  • failed to create cublas handle: CUBLAS_STATUS_ALLOC_FAILED

    failed to create cublas handle: CUBLAS_STATUS_ALLOC_FAILED

    Rasa Open Source version

    3.1.0

    Rasa SDK version

    3.1.1

    Rasa X version

    No response

    Python version

    3.9

    What operating system are you using?

    Windows

    What happened?

    Run rasa train

    get error.
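    CUBLAS_STATUS_ALLOC_FAILED generally means cuBLAS could not allocate GPU memory when creating its handle. A commonly suggested workaround (an assumption here, not something confirmed in this issue) is to enable TensorFlow's on-demand GPU memory growth before TensorFlow is imported:

```python
import os

# Workaround sketch: TF_FORCE_GPU_ALLOW_GROWTH makes TensorFlow allocate
# GPU memory incrementally instead of reserving it all at startup. It must
# be set before TensorFlow is imported -- e.g. exported in the shell
# before running `rasa train`.
os.environ["TF_FORCE_GPU_ALLOW_GROWTH"] = "true"
print(os.environ["TF_FORCE_GPU_ALLOW_GROWTH"])
```

    On Windows this can be set with `set TF_FORCE_GPU_ALLOW_GROWTH=true` in the same terminal session before training.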

    Command / Request

    No response

    Relevant log output

    2022-05-18 14:07:39.437065: E tensorflow/stream_executor/cuda/cuda_blas.cc:226] failed to create cublas handle: CUBLAS_STATUS_ALLOC_FAILED
    2022-05-18 14:07:39.437600: E tensorflow/stream_executor/cuda/cuda_blas.cc:226] failed to create cublas handle: CUBLAS_STATUS_ALLOC_FAILED
    2022-05-18 14:07:39.437838: E tensorflow/stream_executor/cuda/cuda_blas.cc:226] failed to create cublas handle: CUBLAS_STATUS_ALLOC_FAILED
    Traceback (most recent call last):
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\graph.py", line 464, in __call__
        output = self._fn(self._component, **run_kwargs)
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 920, in train
        self.model.fit(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
        raise e.with_traceback(filtered_tb) from None
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\temp_keras_modules.py", line 388, in fit
        tmp_logs = self.train_function(iterator)
    tensorflow.python.framework.errors_impl.InternalError: 2 root error(s) found.  (0) INTERNAL:  Attempting to perform BLAS operation using StreamExecutor without BLAS support
             [[node embed_label/embed_layer_label/MatMul
     (defined at C:\Repo\rasa-demo\venv\lib\site-packages\keras\layers\core\dense.py:199)
    ]]
             [[Func/crf/cond/StatefulPartitionedCall/crf/cond/else/_236/input/_538/_362]]
      (1) INTERNAL:  Attempting to perform BLAS operation using StreamExecutor without BLAS support
             [[node embed_label/embed_layer_label/MatMul
     (defined at C:\Repo\rasa-demo\venv\lib\site-packages\keras\layers\core\dense.py:199)
    ]]
    0 successful operations.
    0 derived errors ignored. [Op:__inference_train_function_48041]
    
    Errors may have originated from an input operation.
    Input Source operations connected to node embed_label/embed_layer_label/MatMul:
    In[0] Sum_1 (defined at C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py:1493)
    In[1] embed_label/embed_layer_label/MatMul/ReadVariableOp:
    
    Operation defined at: (most recent call last)
    >>>   File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 197, in _run_module_as_main
    >>>     return _run_code(code, main_globals, None,
    >>>
    >>>   File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 87, in _run_code
    >>>     exec(code, run_globals)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\Scripts\rasa.exe\__main__.py", line 7, in 
    <module>
    >>>     sys.exit(main())
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\__main__.py", line 119, in main
    >>>     cmdline_arguments.func(cmdline_arguments)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 59, in <lambda>
    >>>     train_parser.set_defaults(func=lambda args: run_training(args, can_exit=True))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 91, in run_training
    >>>     training_result = train_all(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\api.py", line 105, in train
    >>>     return train(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 160, in train
    >>>     return _train_graph(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 234, in _train_graph
    >>>     trainer.train(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\training\graph_trainer.py", line 105, in train
    >>>     graph_runner.run(inputs={PLACEHOLDER_IMPORTER: importer})
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\runner\dask.py", line 101, in run
    >>>     dask_result = dask.get(run_graph, run_targets)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 553, in get_sync
    >>>     return get_async(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 495, in get_async
    >>>     fire_tasks(chunksize)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 490, in fire_tasks
    >>>     fut = submit(batch_execute_tasks, each_args)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 538, in submit
    >>>     fut.set_result(fn(*args, **kwargs))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in batch_execute_tasks
    >>>     return [execute_task(*a) for a in it]
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in <listcomp>
    >>>     return [execute_task(*a) for a in it]
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 220, in execute_task
    >>>     result = _execute_task(task, data)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\core.py", line 119, in _execute_task
    >>>     return func(*(_execute_task(a, cache) for a in args))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\graph.py", line 464, in __call__
    >>>     output = self._fn(self._component, **run_kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 920, in train
    >>>     self.model.fit(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\temp_keras_modules.py", line 388, in fit
    >>>     tmp_logs = self.train_function(iterator)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 878, in train_function
    >>>     return step_function(self, iterator)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 867, in step_function
    >>>     outputs = model.distribute_strategy.run(run_step, args=(data,))      
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 860, in run_step
    >>>     outputs = model.train_step(data)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\models.py", line 144, in train_step
    >>>     prediction_loss = self.batch_loss(batch_in)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1622, in batch_loss
    >>>     loss = self._batch_loss_intent(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1661, in _batch_loss_intent
    >>>     loss, acc = self._calculate_label_loss(sentence_vector, label, label_ids)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1559, in _calculate_label_loss
    >>>     all_label_ids, all_labels_embed = self._create_all_labels()
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1510, in _create_all_labels
    >>>     all_labels_embed = self._tf_layers[f"embed.{LABEL}"](x)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\base_layer.py", line 1083, in __call__
    >>>     outputs = call_fn(inputs, *args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 92, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\layers.py", line 464, in call
    >>>     x = self._dense(x)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\base_layer.py", line 1083, in __call__
    >>>     outputs = call_fn(inputs, *args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 92, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\layers\core\dense.py", line 199, in call
    >>>     outputs = tf.matmul(a=inputs, b=self.kernel)
    >>>
    
    Input Source operations connected to node embed_label/embed_layer_label/MatMul:
    In[0] Sum_1 (defined at C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py:1493)
    In[1] embed_label/embed_layer_label/MatMul/ReadVariableOp:
    
    Operation defined at: (most recent call last)
    >>>   File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 197, in _run_module_as_main
    >>>     return _run_code(code, main_globals, None,
    >>>
    >>>   File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 87, in _run_code
    >>>     exec(code, run_globals)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\Scripts\rasa.exe\__main__.py", line 7, in 
    <module>
    >>>     sys.exit(main())
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\__main__.py", line 119, in main
    >>>     cmdline_arguments.func(cmdline_arguments)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 59, in <lambda>
    >>>     train_parser.set_defaults(func=lambda args: run_training(args, can_exit=True))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 91, in run_training
    >>>     training_result = train_all(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\api.py", line 105, in train
    >>>     return train(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 160, in train
    >>>     return _train_graph(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 234, in _train_graph
    >>>     trainer.train(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\training\graph_trainer.py", line 105, in train
    >>>     graph_runner.run(inputs={PLACEHOLDER_IMPORTER: importer})
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\runner\dask.py", line 101, in run
    >>>     dask_result = dask.get(run_graph, run_targets)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 553, in get_sync
    >>>     return get_async(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 495, in get_async
    >>>     fire_tasks(chunksize)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 490, in fire_tasks
    >>>     fut = submit(batch_execute_tasks, each_args)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 538, in submit
    >>>     fut.set_result(fn(*args, **kwargs))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in batch_execute_tasks
    >>>     return [execute_task(*a) for a in it]
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in <listcomp>
    >>>     return [execute_task(*a) for a in it]
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 220, in execute_task
    >>>     result = _execute_task(task, data)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\core.py", line 119, in _execute_task
    >>>     return func(*(_execute_task(a, cache) for a in args))
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\graph.py", line 464, in __call__
    >>>     output = self._fn(self._component, **run_kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 920, in train
    >>>     self.model.fit(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\temp_keras_modules.py", line 388, in fit
    >>>     tmp_logs = self.train_function(iterator)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 878, in train_function
    >>>     return step_function(self, iterator)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 867, in step_function
    >>>     outputs = model.distribute_strategy.run(run_step, args=(data,))      
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\training.py", line 860, in run_step
    >>>     outputs = model.train_step(data)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\models.py", line 144, in train_step
    >>>     prediction_loss = self.batch_loss(batch_in)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1622, in batch_loss
    >>>     loss = self._batch_loss_intent(
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1661, in _batch_loss_intent
    >>>     loss, acc = self._calculate_label_loss(sentence_vector, label, label_ids)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1559, in _calculate_label_loss
    >>>     all_label_ids, all_labels_embed = self._create_all_labels()
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\nlu\classifiers\diet_classifier.py", line 1510, in _create_all_labels
    >>>     all_labels_embed = self._tf_layers[f"embed.{LABEL}"](x)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\base_layer.py", line 1083, in __call__
    >>>     outputs = call_fn(inputs, *args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 92, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\utils\tensorflow\layers.py", line 464, in call
    >>>     x = self._dense(x)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 64, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\engine\base_layer.py", line 1083, in __call__
    >>>     outputs = call_fn(inputs, *args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\utils\traceback_utils.py", line 92, in error_handler
    >>>     return fn(*args, **kwargs)
    >>>
    >>>   File "C:\Repo\rasa-demo\venv\lib\site-packages\keras\layers\core\dense.py", line 199, in call
    >>>     outputs = tf.matmul(a=inputs, b=self.kernel)
    >>>
    
    Function call stack:
    train_function -> train_function
    
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 197, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "C:\Repo\rasa-demo\venv\Scripts\rasa.exe\__main__.py", line 7, in <module>
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\__main__.py", line 119, in main
        cmdline_arguments.func(cmdline_arguments)
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 59, in <lambda>
        train_parser.set_defaults(func=lambda args: run_training(args, can_exit=True))
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\cli\train.py", line 91, in run_training
        training_result = train_all(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\api.py", line 105, in train
        return train(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 160, in train
        return _train_graph(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\model_training.py", line 234, in _train_graph
        trainer.train(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\training\graph_trainer.py", line 105, in train
        graph_runner.run(inputs={PLACEHOLDER_IMPORTER: importer})
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\runner\dask.py", line 101, in run
        dask_result = dask.get(run_graph, run_targets)
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 553, in get_sync
        return get_async(
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 496, in get_async
        for key, res_info, failed in queue_get(queue).result():
      File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\concurrent\futures\_base.py", line 438, in result
        return self.__get_result()
      File "C:\Users\i\AppData\Local\Programs\Python\Python39\lib\concurrent\futures\_base.py", line 390, in __get_result
        raise self._exception
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 538, in submit
        fut.set_result(fn(*args, **kwargs))
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in batch_execute_tasks
        return [execute_task(*a) for a in it]
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 234, in <listcomp>
        return [execute_task(*a) for a in it]
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 225, in execute_task
        result = pack_exception(e, dumps)
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\local.py", line 220, in execute_task
        result = _execute_task(task, data)
      File "C:\Repo\rasa-demo\venv\lib\site-packages\dask\core.py", line 119, in _execute_task
        return func(*(_execute_task(a, cache) for a in args))
      File "C:\Repo\rasa-demo\venv\lib\site-packages\rasa\engine\graph.py", line 471, in __call__
        raise GraphComponentException(
    rasa.engine.exceptions.GraphComponentException: Error running graph component for node train_DIETClassifier5.
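
    Editor's note, not part of the original report: `CUBLAS_STATUS_ALLOC_FAILED` typically points at GPU memory exhaustion rather than a Rasa bug — TensorFlow reserves nearly all GPU memory up front, and cuBLAS then cannot allocate its workspace. A commonly suggested workaround, sketched here under that assumption, is to let TensorFlow grow its GPU memory allocation on demand before re-running training:

    ```shell
    # Let TensorFlow allocate GPU memory incrementally instead of
    # reserving (almost) all of it at startup. TensorFlow reads this
    # environment variable when it initializes the GPU.
    export TF_FORCE_GPU_ALLOW_GROWTH=true
    # On Windows cmd, use:  set TF_FORCE_GPU_ALLOW_GROWTH=true
    # Then re-run training, e.g.:  rasa train
    ```

    If the error persists, reducing batch size in the DIETClassifier config or freeing GPU memory held by other processes are the usual next steps.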
    
  • Bump actions/setup-node from 2.3.0 to 3.2.0

    Bumps actions/setup-node from 2.3.0 to 3.2.0.

    Release notes

    Sourced from actions/setup-node's releases.

    Add current, node, latest aliases

    In scope of this release we added new aliases to install the latest Node.js version. actions/setup-node#483

    steps:
    - uses: actions/[email protected]
    - uses: actions/[email protected]
      with:
        node-version: current
    - run: npm ci
    - run: npm test
    

    Update actions/cache version to 2.0.2

    In scope of this release we updated actions/cache package as the new version contains fixes related to GHES 3.5 (actions/setup-node#460)

    Add caching support on GHES 3.5

    In scope of this release we added support for caching on GHES 3.5 and fixed a download issue for files > 2GB during restore. Besides, we updated the actions/cache dependency to version 2.0.0.

    v3.0.0

    In scope of this release we changed the Node.js runtime version for the setup-node action and updated the package-lock.json file to v2.

    Breaking Changes

    Fix logic of error handling for npm warning and uncaught exception

    In scope of this release we fixed the error handling logic related to caching (actions/setup-node#358 and actions/setup-node#359).

    In the previous behaviour, the action relied on stderr output to throw an error, but package managers can write warning messages to stderr. The action now throws an error only if the exit code differs from zero. Besides, we added logic to catch and log unhandled exceptions.

    Adding Node.js version file support

    In scope of this release we added the node-version-file input and updated the actions/cache dependency to the latest version.

    The new input (node-version-file) provides functionality to specify the path to a file containing the Node.js version, with the following behaviour:

    • If the file does not exist the action will throw an error.
    • If you specify both node-version and node-version-file inputs, the action will use value from the node-version input and throw the following warning: Both node-version and node-version-file inputs are specified, only node-version will be used.
    • The action does not yet support the full variety of values for Node.js version files. It can handle values according to the documentation and values with a v prefix (v14).
    steps:
      - uses: actions/[email protected]
      - name: Setup node from node version file
        uses: actions/[email protected]
        with:
          node-version-file: '.nvmrc'
      - run: npm install
      - run: npm test
    

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • Bump actions/github-script from 4.0.2 to 6.1.0

    Bumps actions/github-script from 4.0.2 to 6.1.0.

    Release notes

    Sourced from actions/github-script's releases.

    v6.1.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/actions/github-script/compare/v6.0.0...v6.1.0

    v6.0.0

    What's Changed

    Breaking Changes

    With the update to Node 16 in #235, all scripts will now be run with Node 16 rather than Node 12.

    New Contributors

    Full Changelog: https://github.com/actions/github-script/compare/v5...v6.0.0

    v5.1.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/actions/github-script/compare/v5.0.0...v5.1.0

    v5.0.0

    What's Changed

    Breaking Changes

    As part of this update, the Octokit context available via github no longer has REST methods directly. These methods are available via github.rest.* - https://github.com/octokit/plugin-rest-endpoint-methods.js/releases/tag/v5.0.0

    See https://github.com/actions/github-script#breaking-changes-in-v5
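
    To illustrate that breaking change, a minimal hypothetical workflow step (the comment body and trigger context are illustrative) calling a REST method through the new github.rest.* namespace might look like:

    ```yaml
    # Hypothetical example: post a comment on the triggering issue.
    - uses: actions/github-script@v6
      with:
        script: |
          // before v5 this was github.issues.createComment(...)
          await github.rest.issues.createComment({
            owner: context.repo.owner,
            repo: context.repo.repo,
            issue_number: context.issue.number,
            body: "Thanks for the report!",
          })
    ```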

    ... (truncated)

    Commits
    • 7a5c598 Merge pull request #263 from smaeda-ks/update-actions-core
    • cb1c1eb Classify http-client licenses
    • 6203d71 Update licenses
    • 19fe498 Update @actions/core to 1.8.1
    • 9bd6ae6 Merge pull request #254 from dlech/patch-1
    • e44260d README: use pull_request_target in example
    • 0541812 Merge pull request #251 from actions/dependabot/npm_and_yarn/minimist-1.2.6
    • b82abb9 Merge pull request #252 from josh-/add-formatting-example-readme
    • d965d37 Add text formatting example to README
    • 7cf7d15 Bump minimist from 1.2.5 to 1.2.6
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
