Use streaming JSON to reduce latency on mobile

beau | 189 points

I agree that a streaming response is cool, but why the dismissal of returning valid JSON, streamed? Why invent a new protocol when JSON already exists? Streaming JSON parsers aren't unicorns, they are horses (sorry).

With this newline-delimited JSON format all your clients HAVE to know about your new protocol. They have to stream the response bytes, split on newlines, unescape newlines in the payload (how are we doing that, btw?), etc. If a client doesn't care about streaming, it can't just sit on the response and parse it when it's done coming in. And what if you later upgrade the system so that the response is instant and streaming is no longer necessary? Then you move on to a new API and have to keep supporting this old streaming-but-not-really endpoint forever.
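To make the client-side burden concrete, here's a minimal sketch of the consumer each client would need (a hypothetical helper, not from the article; it assumes one complete JSON value per line, with newlines inside string values already escaped as \n, which JSON.stringify guarantees):

```javascript
// Minimal NDJSON consumer: takes an async (or sync) iterable of text
// chunks -- e.g. a fetch() body piped through a TextDecoderStream --
// and yields one parsed object per newline-delimited line. Chunk
// boundaries need not align with line boundaries, so we buffer the
// partial trailing line between chunks.
async function* parseNDJSON(chunks) {
  let buffer = "";
  for await (const chunk of chunks) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the (possibly empty) partial last line
    for (const line of lines) {
      if (line.trim() !== "") yield JSON.parse(line);
    }
  }
  if (buffer.trim() !== "") yield JSON.parse(buffer); // final unterminated line
}
```

Note that valid JSON may not contain raw newlines inside strings (they must be escaped), so splitting the byte stream on "\n" is safe; that answers the unescaping aside, at least for well-formed payloads.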

pkulak | 6 years ago

Somewhat related is the JSON Lines "spec": http://jsonlines.org/
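For reference, a JSON Lines stream is just one JSON value per line (hypothetical sample data):

```
{"id": 1, "lat": 59.33, "lng": 18.06}
{"id": 2, "lat": 59.34, "lng": 18.07}
```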

rhacker | 6 years ago

I have nothing but praise for this service. It's fast and efficient, and does the job without bloat.

For instance, their iOS app weighs 888.8 KB! When it's common for simple apps to be 50 MB monsters, it's very refreshing to use something that has been developed with proper care.

slig | 6 years ago

You say that using websockets was less reliable than streaming over HTTPS; can you elaborate on why? In my experience websockets are perfect for the use case you described. Are there disadvantages?

zeger | 6 years ago

As someone who actually regularly uses a slow mobile connection (8 kilobytes per second!) with somewhat high ping (~90ms to Google), please don't do this thinking you're making my life significantly better. It barely makes a difference in performance once loaded, and the initial load time is ridiculously worse. I'd much rather you make your page work without JavaScript, kept the design light (as this page has otherwise done), and make your CSS cacheable.

Right now, it takes over 5 seconds(!!) for this page to load because of all the freaking JavaScript it has to download! With JS off, the page loads almost immediately. With a keep-alive connection, subsequent loads over HTTPS are not particularly long, unlike what this article seems to think. (Hacker News is one of the FASTEST sites I can access, for example. Even on my crappy connection, pages load nearly instantly.)

Simply letting me type, press enter, and wait 0.1~0.3 seconds for a new page response would not be a significantly worse experience -- however, due to the way the site is written, search doesn't work AT ALL with JS disabled.

So, lots of engineering effort (compared to just serving up a new page) for little to no actual speed improvement, and a more brittle website that breaks completely on unusual configurations... Yeah. Please don't do this!

Figs | 6 years ago

Really interesting stuff! What about simply opening a WebSocket connection and using that for all requests, if connection latency is such an issue?

erikrothoff | 6 years ago

I did this a few years ago before I knew it was a "thing" and felt really proud that it actually worked.

The use case was a slow database query that returned map pins. The first pins came back in milliseconds, but the last ones would take seconds. The UI was vastly improved by streaming the data instead of waiting for it all to finish, and the server code was easy to implement.

A different delimiter would have worked, but newlines are easy to see in a debugger.
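That pattern can be sketched in a few lines of Node-style code (hypothetical streamPins helper; assumes rows arrive incrementally from the query as an async iterable):

```javascript
// Write each row to the response as soon as the database yields it,
// one JSON object per newline-terminated line, instead of buffering
// the whole result set. `rows` is any (async) iterable of row objects;
// `res` is anything with a write() method, e.g. an http.ServerResponse.
async function streamPins(rows, res) {
  for await (const row of rows) {
    res.write(JSON.stringify(row) + "\n"); // one complete line per row
  }
}
```

The first rows hit the wire while the query is still running, which is the whole point: the client can start rendering pins immediately.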

fenwick67 | 6 years ago

I'd like to see this streaming JSON parser incorporated into GraphQL clients: http://oboejs.com

tuukkah | 6 years ago

Any benefit to using this over Server-Sent Events? (other than IE/Edge support)

MonkeyDan | 6 years ago

I think websockets are a much better fit for this if you don't want the reconnection overhead. Since websockets are bidirectional, you can keep the connection open and send all requests through it as well as receive responses from it. You can also send binary over websockets if you want to save bandwidth. We do this at work and it works pretty nicely.

iamd3vil | 6 years ago

How is this handled from a UI perspective? As more applications are built around the idea of streaming data, I've found that UI elements tend to jump around, and I find myself clicking/tapping the wrong item more and more, because the item which had been under my thumb/cursor has jumped away just before I could activate it.

bshacklett | 6 years ago

Isn't this pretty similar to how you would use WebSocket frames to transfer individual JSON elements when a client is subscribed?

At one job several years ago we came up with the same idea and used \n-separated JSON elements as a streaming response. We also tossed around the idea of using WebSockets to stream large responses between services.

Osiris | 6 years ago

Let's talk about reliability: the network is unreliable; firewalls might be broken, packets are dropped, IP addresses may change, cellphones lose connection in subway tunnels. Simply calling streaming "reliable" without even defining what the reliability is protecting against makes "reliable" an overstatement.

IMHO the most reliable way to get data from point A to point B is to have a client actively poll for data with a strict socket timeout. Data should be delivered at least once. If streamed JSON is to be called anything remotely as "reliable" as periodic polling, it should at least have a strict timeout (not mentioned in the article) for receiving the next newline, and it should handle replaying of non-acked messages. Otherwise I would call it far from "reliable".
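The timeout half of that argument can be sketched as follows (hypothetical helper; `fetchFn` stands in for whatever transport performs one poll):

```javascript
// Perform one poll with a hard per-request timeout. If a single request
// hangs past `timeoutMs`, give up on it and let the caller retry, rather
// than waiting indefinitely on a half-dead connection.
async function pollOnce(fetchFn, timeoutMs) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error("poll timeout")), timeoutMs);
  });
  try {
    return await Promise.race([fetchFn(), timeout]);
  } finally {
    clearTimeout(timer); // don't keep the timer pending after a fast response
  }
}
```

At-least-once delivery and replay of non-acked messages would need server-side cooperation (sequence numbers or acks) and are not shown here.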

JensRantil | 6 years ago

Does anyone know a JSON parser that parses an ArrayBuffer instead of strings? [1]

JSON.parse() only accepts strings.

The library that the article recommends also uses XMLHttpRequest with strings. [2]

The reason I'm asking is the maximum string length in 32-bit Chrome.

[1]: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequ...

[2]: https://github.com/eBay/jsonpipe/blob/master/lib/net/xhr.js
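One partial workaround sketch (hypothetical helper, not a true incremental binary parser): for newline-delimited JSON you can scan the ArrayBuffer for 0x0A byte boundaries and decode/parse each record separately, so no single string ever exceeds one record's length even if the whole response would blow past the 32-bit string limit. This is safe because valid JSON cannot contain a raw 0x0A inside a string, and UTF-8 continuation bytes are all >= 0x80.

```javascript
// Parse an ArrayBuffer of newline-delimited JSON without ever building
// one giant string: find each "\n" (0x0A) at the byte level, then decode
// and parse only that record's slice.
function parseNDJSONBuffer(buffer) {
  const bytes = new Uint8Array(buffer);
  const decoder = new TextDecoder("utf-8");
  const out = [];
  let start = 0;
  for (let i = 0; i <= bytes.length; i++) {
    if (i === bytes.length || bytes[i] === 0x0a) { // end of buffer or "\n"
      if (i > start) {
        out.push(JSON.parse(decoder.decode(bytes.subarray(start, i))));
      }
      start = i + 1;
    }
  }
  return out;
}
```

It doesn't help if a single JSON value is itself larger than the string limit, though; that would need a genuine streaming parser over bytes.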

jannes | 6 years ago

If the concern is HTTPS overhead, why not use HTTP/2 and send multiple requests?

I think streaming would be useful only if the responses are stateful and it's hard to share that state across requests.

sajal83 | 6 years ago

Feels to me like the response content type shouldn't be "application/json" anymore (which is what's returned in that first example).

delaaxe | 6 years ago

Is there an advantage to using chunked streams like this over WebSockets, e.g. https://github.com/wybiral/hookah? (That tool takes newline-delimited output from a program's stdout and sends it over WebSockets, even aggregating multiple streams into one.)

wybiral | 6 years ago

I feel like I'm missing something here. Isn't that the point of using a JSON SAX parser instead of a DOM parser?

gumby | 6 years ago

How is this different than SSE https://html.spec.whatwg.org/multipage/comms.html#server-sen... ? Or, why would someone choose this over SSE?

cwt137 | 6 years ago

Why not just gzip the JSON? Should make complicated JSON around an order of magnitude smaller, and be more portable to boot.

nategri | 6 years ago

Why would you use this over web sockets?

osrec | 6 years ago

Should have (2017) in the title.

stringham | 6 years ago