Can I Host a MS Bot Framework Node.js Instance On-Premises
We've built an MS Bot Framework bot that consumes our existing, internal, on-premises APIs during conversations. We'd like to release this bot by dropping a Web Chat Component into the DOM of our existing, internally-facing, on-premises application.

With our existing architecture, naturally, we want to host this bot internally too--to leverage all our existing configuration and deployment processes. We understand that, regardless, the bot will have to communicate with LUIS--which is fine by us; it doesn't require the more complex (larger attack surface, less central IT buy-in) setup of Azure connecting directly to our internal business data API.

I think this diagram makes it clearer:

Bot Hosting Configurations

Can we achieve what's depicted in the bottom hosting configuration?

EDIT 1: Can we also host Direct Line or a similar connector on-premises without having to write a custom connector? Additionally, can we chat with our bot over such a connector without having to write a custom chat component/widget for the DOM? (The Web Chat component would work just fine as long as it's pointed at our channel.)
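For instance, assuming a Direct Line-compatible endpoint existed internally, the stock Web Chat widget could presumably be pointed at it without a custom widget: the DirectLineJS client that backs `botframework-webchat` accepts a `domain` option. The internal host name and token service below are hypothetical; this is a sketch, not a confirmed setup:

```javascript
// Sketch only: builds DirectLineJS options aimed at a hypothetical
// on-premises Direct Line-compatible connector instead of the default
// https://directline.botframework.com endpoint.
function onPremDirectLineOptions(internalHost, token) {
  return {
    domain: `https://${internalHost}/v3/directline`, // assumed internal connector
    token,            // would be issued by an on-prem token service, not Azure
    webSocket: false  // plain HTTPS polling; no extra ports to open internally
  };
}

// Browser wiring (not executed here), using the stock Web Chat component:
//   const directLine = window.WebChat.createDirectLine(
//     onPremDirectLineOptions('bots.intranet.example.com', token));
//   window.WebChat.renderWebChat({ directLine }, document.getElementById('webchat'));
```

With this shape, no traffic would need to leave the internal network as long as the connector behind `domain` is hosted on-premises.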

The end goal here is to get all of our chat traffic to stay on-premises, because this is a data-driven chatbot serving sensitive numbers. It would take less time to redevelop this in another framework that can run wholly on-premises than to get approval from our central IT.

Side Note: I'm aware of the Azure Stack Preview. The minimum hardware requirements (and probably subscription costs too) are extreme overkill. (We're talking about a single Node app, after all.)

This is not a duplicate of this question, because this question also addresses the key element of hosting the Direct Line connector on-premises, where the other question assumed that the connector would still run on Azure.

Wisla answered 31/1, 2017 at 16:42 Comment(2)
Possible duplicate of Bot Framework without Azure possible? – Hogfish
Updated the question to address the difference between the two. – Wisla

First of all, any chatbot is just the program that runs alongside the NLP; it's the NLP that brings the knowledge to the chatbot, and NLP rests on machine-learning techniques.

There are a few reasons why on-premises chatbots are less common:

  • We need to build the infrastructure ourselves
  • We need to retrain the model often

On the other hand, a cloud-based NLP may not provide the data privacy and security you need, and the flexibility to include your own business logic is limited. Altogether, going on-premises or to the cloud depends on the needs and use case of your requirements. Please refer to the links below for end-to-end knowledge on building a fully customisable chatbot on-premises, in a few easy steps, using open-source frameworks and tools (Botkit, RASA, etc.).

They also explain how to host the Bot Framework on-premises.

  • Complete On-Premise and Fully Customisable Chat Bot - Part 1 - Overview: https://creospiders.blogspot.com/2018/03/complete-on-premise-and-fully.html
  • Complete On-Premise and Fully Customisable Chat Bot - Part 2 - Agent Building Using Botkit: https://creospiders.blogspot.com/2018/03/complete-on-premise-and-fully_16.html
  • Complete On-Premise and Fully Customisable Chat Bot - Part 3 - Communicating to the Agent that has been built: https://creospiders.blogspot.com/2018/04/CompleteOn-PremiseandFullyCustomisableChatBotpart3.html
  • Complete On-Premise and Fully Customisable Chat Bot - Part 4 - Integrating the Natural Language Processor NLP: https://creospiders.blogspot.com/2018/07/complete-on-premise-and-fully.html

Landon answered 28/12, 2018 at 16:25 Comment(1)
This is irrelevant, @PranavKAndro. The question is exclusively about the API and target deployment environment for the Microsoft Bot Framework. – Wisla

I am currently facing a similar architectural dilemma. From what we've managed to establish: in principle, yes. How? A bot is just a web service. You can deploy it anywhere you want, but you will need another web service to mediate between the Bot Framework app and a client: a custom connector.

If you want to use the various connector services of the Bot Service (Web Chat, Skype, Slack), though, you have to deploy to Azure.

If you want to connect to some of these channels from on-premises, again, you need to write your own connectors.

How do you write a connector? Taking a peek at how the Bot Framework Emulator from Microsoft simulates the Direct Line API is a good start, and in my particular context we may do exactly that if we exhaust the legal/security avenues for deploying to Azure.

Astrogeology answered 31/1, 2017 at 17:55 Comment(4)
I was afraid of that. I figured in principle there must be a way to write a custom connector without using an Azure-hosted Direct Line, since, like you mention, that's what the emulator does. It seems like such a simple use case. However, I'm not committed to building a fork of a dev tool (the emulator) meant for a product that's in preview. It's not really worth buying into a particular chatbot platform if we have to write and maintain a forked middleware that ostensibly should come out of the box. – Wisla
"If you want to use various connector services of the Bot Service (web chat, Skype, Slack), you have to deploy to Azure though." This statement is not true. The bot can be deployed anywhere (following the prereqs for that, see this #40888989) and still talk with the other channels. In fact, you can run a bot locally (without deploying anywhere) and, via ngrok, use it on the channels, for example, to test things out. – Hogfish
@EzequielJadib: You are correct that it's possible to run the bot locally (or anywhere) and connect to it from elsewhere via ngrok and a connector (on Azure). I should have clarified further: Can I host my bot on-premises and host a connector on-premises such that all of the traffic between my internally-facing webapp and my bot occurs on-premises, without having to write a custom chat component/widget for my webapp's DOM? – Wisla
@EzequielJadib Thanks for the link; I actually did not know that. I need to think about whether this simplifies my use case and helps me make a final architectural decision; at the very least it keeps the Microsoft-hosted connectors a viable option. – Astrogeology

We are doing similar work - jira-journal - and we use ngrok to host the bot. The bot resides on-premises and uses ngrok to expose the endpoint, which we then register on the bot portal.
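As a side note, the ngrok invocation for this setup is a one-liner. The port below is the Bot Framework SDK's conventional default, and the subdomain is hypothetical; reserved subdomains require a paid ngrok plan, and newer ngrok releases use `--domain` instead of `--subdomain`:

```shell
# Tunnel the locally hosted bot (SDK default port 3978) out to the
# Azure-hosted connector; the subdomain shown is hypothetical.
ngrok http 3978 --subdomain=our-bot

# Messaging endpoint to register on the bot portal:
#   https://our-bot.ngrok.io/api/messages
```

Reserving a fixed subdomain means the public endpoint survives tunnel restarts, so the portal entry does not have to change each time.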

The pain point we currently face is that whenever we rehost the bot, we get a new endpoint, which we then have to go and update on the bot portal :(

Spunky answered 2/7, 2017 at 11:13 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.