As a team of six, we created and improved tools that make it easy to build streaming isomorphic web applications. When complete, these tools will have a significant impact on web performance. As far as our mentors (Justin and Matthew) know, this will be the first isomorphic web framework that supports streaming.
See a video on YouTube of a prototype app we built this week, using the tools we built this week. Here's another video with 500 items.
A more in-depth technical overview of the work can be found in the master issue.
Terms:
isomorphic web applications - Also known as universal web applications, isomorphic web applications can run both client-side and server-side. These applications provide server-side rendering (SSR) for quick initial performance, and then load the web application into the browser for fast subsequent performance.
server-side rendering (SSR) - Server-side rendering means taking a JavaScript web app that can run in the browser, loading it in a server environment (typically NodeJS), and using it to produce the initial HTML response. This results in faster perceived performance because the user can immediately see the content.
Where a traditional web app experience has multiple steps before a user sees meaningful content:
With a server-side rendered application, the user sees their content immediately:
streaming - A streaming application is able to make use of partial data or content as it arrives. Browsers are able to process an HTML response stream. For example, if the following HTML is sent to the browser, it is able to download style.css while it waits on the remaining content:

```html
<html>
  <head>
    <link rel="stylesheet" type="text/css" href="style.css">
  </head>
  <body>
    <h1>Start of Content</h1>
    <ul>
      <!-- PENDING CONTENT -->
```
Our work will utilize this ability of the browser by flushing out parts of a virtual DOM as they finish rendering. A rough flow of the server-side rendering algorithm looks like:
An outline of the rendering algorithm can be found here.
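A rough sketch of that flow, assuming a minimal virtual DOM shape (an object with tag and children, where a child may be a Promise for content still being rendered); this is a simplified illustration, not the actual vdom-streaming-serializer code:

```js
// Walk the tree depth-first, flushing each element's opening tag
// immediately, and await any child whose content is still rendering
// (represented here as a Promise).
async function serializeNode(node, write) {
  if (typeof node === "string") {   // text node: flush it directly
    write(node);
    return;
  }
  write("<" + node.tag + ">");      // opening tag can be sent right away
  for (let child of node.children) {
    const resolved = await child;   // pending children resolve later
    await serializeNode(resolved, write);
  }
  write("</" + node.tag + ">");     // close once all children are done
}

// Usage: the <li> content arrives asynchronously, but <ul> could already
// have been flushed to the browser.
const tree = {
  tag: "ul",
  children: [Promise.resolve({ tag: "li", children: ["first todo"] })]
};
const chunks = [];
serializeNode(tree, chunks.push.bind(chunks)).then(function() {
  console.log(chunks.join("")); // <ul><li>first todo</li></ul>
});
```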
- An ndjson stream transformer:
```js
fetch("todos.ndjson")
  .then((response) => {
    return ndjsonStream(response.body);
  })
  .then((todosStream) => {
    // ...
  });
```
- A streaming can-connect behavior:
```js
// get a streaming list of todos
Todo.getList({}).then(function(todos) {
  template({ todos: todos });
});
```

```html
<!-- template updates as todos are returned -->
<ul>
  {{#each todos}}
    <li>{{name}}</li>
  {{/each}}
</ul>
```
- A vdom streaming serializer:

```js
var http = require('http');
var serialize = require('vdom-streaming-serializer');

http.createServer(function(request, response) {
  var document = makeDocument();
  // ... ADD APP TO DOCUMENT ...
  myApp(document);
  // send back HTML as it is "completed"
  var stream = serialize(document.documentElement);
  stream.pipe(response);
}).listen(8080);
```
- donejs-streaming-dev-server
We were able to complete 4 of the 6 issues in the Streamable apps proposal, and we created 3 new open source projects in the process. Remaining are:
- can-view-live-streamable, a project that makes live-bound DOM elements streamable.
- can-observation-zone which registers future asynchronous state changes and returns a 'completed' notification callback.
These projects will complete the streaming feature in DoneJS, making it one of the first frameworks to have an end-to-end streaming solution.
The following lists the contributions of each member on the team:
Juncheng Tang ([email protected])
Add a demo test to the testcases
Added a demo testcase for the serializer function to verify that it works properly.
Create Async Recursive Testcases
Added an async recursive test to the serializer function to verify that it works for multiple recursive async nodes.
[Set up PostgreSQL Environment and update ReadMe](donejs/demo-streaming-dev-server#2)
Set up the PostgreSQL environment for the demo and updated the readme to guide other developers.
[Generate Database and Connect to Server](donejs/demo-streaming-dev-server#3)
Generated a large todos database using a bash shell script, with id and text columns. Then connected to PostgreSQL on the local server using the pg library, and tested the shell script.
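A hypothetical sketch of such a generation script (file name, row count, and load command here are illustrative; the real script lives in the demo-streaming-dev-server repo):

```shell
#!/bin/bash
# generate 500 rows of (id, text) as CSV -- a stand-in for the real script
for i in $(seq 1 500); do
  echo "$i,todo item $i"
done > todos.csv

# the rows could then be loaded into PostgreSQL, for example with:
# psql -d todos_db -c "\copy todos(id, text) FROM 'todos.csv' WITH (FORMAT csv)"
```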
[Generate emoji](donejs/demo-streaming-dev-server#8)
Added a new emoji column to the database, generated random emoji decimal codes, and changed the demo page to show all the emojis.
Yu-Lin Yang's Contributions ([email protected] [email protected])
I assisted in the generation of the project, then created multiple tests to detect cases where the serializer would not function correctly. This, in turn, shed light on new issues which I addressed, such as HTML attribute handling, as well as making the program more efficient by omitting the call to .documentElement on the testing side. The original issue for this contribution project can be found here. All the code and changes in the pull requests below were written by me, but the team collaborated heavily to ensure cohesion and consistency across the project (especially in readme.md).
I initiated the first commit and set up the project environment according to the defined goals and tasks for this Hackathon. This included setting up the travis.yml file as well as updating the readme.md. The issue can be found here.
I created multiple tests to ensure the consistency of the serializer, including tests to verify: multiple lists, nested lists, different HTML attribute tags, and improper HTML formatting. The pull request can be found here.
I replaced simply appending an attribute's name with serializing the actual name and value of each attribute in a given HTML tag, which can be found here:
```js
var attr;
for (var i = 0; i < element.attributes.length; i++) {
  attr = element.attributes[i];
  buffer += " " + attr.name + " = '" + attr.value + "'";
}
buffer += '>';
```
The code above also removes the need to call .documentElement on document elements by iterating through the element's attributes rather than the element itself. The pull request can be found here.
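Assembled as a self-contained function (a sketch over a plain element-like object, not the serializer's exact code), the attribute handling looks like:

```js
// Sketch: build an element's opening tag from its attributes collection.
// The element here is a plain object standing in for a (v)DOM element.
function openTag(element) {
  var buffer = "<" + element.tagName.toLowerCase();
  var attr;
  for (var i = 0; i < element.attributes.length; i++) {
    attr = element.attributes[i];
    buffer += " " + attr.name + " = '" + attr.value + "'";
  }
  return buffer + ">";
}

var el = {
  tagName: "A",
  attributes: [
    { name: "href", value: "/todos" },
    { name: "class", value: "item" }
  ]
};
console.log(openTag(el)); // <a href = '/todos' class = 'item'>
```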
I updated the readme.md file to include a how-to-use and tutorial section for the serializer, as well as an example of a generated vdom running the tests. The readme.md also reflected the work and changes my teammates made. Lastly, I added an npm badge as well as a Travis badge to show version requirements. The pull request can be found here.
Fang Lu ([email protected])
During this weekend at HackIllinois, I worked on implementing two new components of the CanJS project with the CanJS and DoneJS developers. These two components work as npm modules, and we started new repositories for them.
The goal of our projects is to make webpages load more efficiently: we stream data so that the browser can render whatever is available as soon as possible and does not have to wait for the request to completely finish before performing the task. The two components I worked on are related to the newline-delimited JSON (NDJSON) stream.
This component creates a readable JSON object stream from a fetch response stream.
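The core of that conversion can be sketched as a chunk-by-chunk line parser; this is a simplification of what can-ndjson-stream does internally with a TransformStream, and makeNdjsonParser is an illustrative name, not the module's API:

```js
// NDJSON is one JSON value per line. Network chunks can split a line in
// the middle, so the parser buffers any incomplete trailing line and
// emits one JavaScript object per completed line.
function makeNdjsonParser() {
  var buffer = "";
  return function parseChunk(chunk) {
    buffer += chunk;
    var lines = buffer.split("\n");
    buffer = lines.pop(); // keep any incomplete trailing line for later
    return lines.filter(function(l) { return l.trim(); }).map(JSON.parse);
  };
}

var parse = makeNdjsonParser();
console.log(parse('{"id":1,"text":"di'));              // [] -- line incomplete
console.log(parse('shes"}\n{"id":2,"text":"mow"}\n')); // both todo objects
```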
We did not exactly work on an issue. This is part of the donejs workflow and we are implementing this feature from scratch.
In developing this project, I mainly dealt with asynchronous tasks and readable streams, both of which are new, experimental web technologies. Fortunately, I am somewhat familiar with them. It took the most effort to sort out the correct order of asynchronous executions and chain them correctly.
Our mentor/CanJS maintainer, teammates, and I worked in conjunction on this project. I was responsible for the main coding. The project is almost fully complete in the current version: tested, documented, and available on npm. The lingering issues are failures with the Travis CI system.
Since the repository is new, I committed directly to master.
This component works with can-connect to support observable lists with data from ndjson-stream: the list grows and notifies its observers as ndjson-stream streams more and more data.
We did not exactly work on an issue. This is part of the donejs workflow and we are implementing this feature from scratch.
In developing this project, I created an interface that connects an ndjson-stream to a DefineList (observable list). This project depends on existing CanJS code, so it took me a while to understand how CanJS handles the tasks and how internal calls are made.
Our mentor/CanJS maintainer and I worked in conjunction on this project while other teammates worked on testing and documenting can-ndjson-stream. I was responsible for the main coding as well as testing. The project is published to npm, but the documentation and CI testing are still incomplete.
Since the repository is new, I committed directly to master.
Indira Gutierrez ([email protected])
- Refactored the serialize method for vdom-streaming-serializer to use only one while loop and serialize HTML elements asynchronously with a depth-first search approach.
[Updated a Server Route](https://github.com/donejs/demo-streaming-dev-server/commit/3bfd654c0a1b1771ed8c859b95994cb2860d6745)
- Updated a server route to return all elements of a Postgres table after getting all the data, instead of one at a time.
Siyao Wu ([email protected])
I decided to take on decreasing the latency of data transmission in web applications. This was logged as
The solution, worked out in conversation with Justin, was that we can process data as a stream instead of in discrete steps. We developed an ndjsonstream() function to convert a ReadableStream of raw ndjson data into a ReadableStream of JavaScript objects, making data transmission faster.
My contribution was to debug the ndjsonstream() function, and to document it and integrate it within CanJS with Justin. https://github.com/canjs/can-ndjson-stream/commit/2e0fdfd85a0c742ac22c3b1de6ff2567829868a1 https://github.com/canjs/can-ndjson-stream/commit/b32940b4a6db533730de8349396e9fa7f308e9cb
All my teammates, especially Fang, worked hard and did a really great job, and our mentor Justin was talented, hard-working, and willing to help whenever we were in need.