What is it?

OpenTalk2016 is quite simple for now: first you pick a nick to log in, then select a person to chat with.

See for yourself https://opentalk2016.herokuapp.com. Just pick a name :D

OpenTalk2016 is a personal DIY chat app project meant for exploring the capability of the Tornado web framework to hold multiple HTTP connections. Specifically, in the context of a PaaS provider such as Heroku, there are several questions to explore:

  • How long can a connection stay open before a timeout?

  • How does the number of connections affect the application’s performance?

  • Is Tornado + Python a good tool for real-time applications?

  • How to deal with intermittent network connectivity on mobile devices?

What I have learned

  1. Server-Sent Events (aka SSE) can be used as an alternative to WebSockets when data is more likely to originate from the server than from the client. The API is also simpler:

 - The constructor EventSource(url) opens an HTTP connection, expecting the response to have the content type text/event-stream.

 - I ran into issues with the onopen event, which is supposed to fire after the connection has been established but never worked for me. As a reasonable workaround, I have the server send an ack signal, which SSE clients should listen for instead.

 - Sending and receiving signals is quite trivial. On the server side, the signal-generating code is a Python generator loop, yielding self.flush() to push each signal to the client. Each signal packet has a few standard fields in key: value format; a single line break delimits the fields and a double line break terminates the packet. The fields can be event, data and id. The event field indicates the name of the event a client should listen on. For example:


var client = new EventSource(url);  // url is the SSE endpoint serving text/event-stream

client.addEventListener('ack', fnCallback);

function fnCallback (message) { // message.data: string payload, message.type: 'ack'
}

The client would expect the value of the event field to be ack. The value of the data field should be a text representation of the transmitted data (JSON-encoded, base64, HTML, plain text…). A minimal server-side sketch of sending such a packet is shown below.
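The sketch assumes a made-up handler name (AckDemoHandler), route and app setup that are not the project's actual code; it only illustrates serving text/event-stream and emitting a single ack packet.

# Minimal sketch: an SSE endpoint that sends the 'ack' signal once.
# AckDemoHandler and the /channel route are illustrative names only.
from tornado import gen, ioloop, web


class AckDemoHandler(web.RequestHandler):

    @gen.coroutine
    def get(self):
        # EventSource expects this content type on the response.
        self.set_header('Content-Type', 'text/event-stream')
        self.set_header('Cache-Control', 'no-cache')
        # Fields are separated by a single line break; a double
        # line break terminates the packet.
        self.write('event: ack\ndata: connected\n\n')
        yield self.flush()


if __name__ == '__main__':
    app = web.Application([(r'/channel', AckDemoHandler)])
    app.listen(8888)
    ioloop.IOLoop.current().start()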

  2. Tornado IOLoop, gen.coroutine and Python futures: As I mentioned above, each signal-generating request handler is essentially a generator loop, yielding self.flush(). However, to make it non-blocking, we ought to decorate the method with gen.coroutine.

# class ChannelHandler

@gen.coroutine
def get(self):
    # ...
    while True:
        json_encoded_data = json_encode(db_access())
        # One packet per message; the double line break terminates it.
        self.write('event: inbox\ndata: {}\n\n'.format(json_encoded_data))
        yield self.flush()
        # Something should go here
    # ...

But the trick is that such a loop would drain the CPU and flood the network with empty or repeated json_encoded_data. The loop needs to wait for a signal from outside, telling it that new data is ready. In a chat app, the natural trigger for this signal is a chat message being posted and awaiting delivery to the connected users; one possible shape for that signal is sketched below.
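The following sketch uses a shared tornado.locks.Condition (available since Tornado 4.2): the posting handler notifies the condition, and the streaming loop yields on wait() so the coroutine idles until a message actually arrives. The handler names, the in-memory messages list and the wiring are assumptions for illustration, not necessarily how OpenTalk2016 will implement it.

# Sketch: waking the streaming loop with tornado.locks.Condition.
# PostHandler and the in-memory 'messages' list are illustrative
# assumptions, not the project's actual code.
from tornado import gen, web
from tornado.escape import json_encode
from tornado.locks import Condition

new_message = Condition()   # shared by the posting and streaming handlers
messages = []               # stand-in for the real message store


class PostHandler(web.RequestHandler):

    def post(self):
        messages.append(self.get_argument('text'))
        new_message.notify_all()        # wake every waiting channel loop


class ChannelHandler(web.RequestHandler):

    @gen.coroutine
    def get(self):
        self.set_header('Content-Type', 'text/event-stream')
        while True:
            yield new_message.wait()    # idle until a message is posted
            json_encoded_data = json_encode(messages[-1])
            self.write('event: inbox\ndata: {}\n\n'.format(json_encoded_data))
            yield self.flush()

A real version would still need per-conversation routing and care for messages posted while the loop is busy flushing, but the waiting pattern itself stays the same.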

TO BE CONTINUED


(image credit: huffpost.com)